Auto merge of #64946 - Centril:rollup-66mj5o0, r=Centril

Rollup of 10 pull requests

Successful merges:

 - #63674 (syntax: Support modern attribute syntax in the `meta` matcher)
 - #63931 (Stabilize macros in some more positions)
 - #64887 (syntax: recover trailing `|` in or-patterns)
 - #64895 (async/await: improve not-send errors)
 - #64896 (Remove legacy grammar)
 - #64907 (A small amount of tidying-up factored out from PR #64648)
 - #64928 (Add tests for some issues)
 - #64930 (Silence unreachable code lint from await desugaring)
 - #64935 (Improve code clarity)
 - #64937 (Deduplicate closure type errors)

Failed merges:

r? @ghost
bors 2019-10-01 07:56:52 +00:00
commit 702b45e409
103 changed files with 1290 additions and 4574 deletions


@ -1,812 +1,7 @@
% Grammar
# Introduction
The Rust grammar may now be found in the [reference]. Additionally, the [grammar
working group] is working on producing a testable grammar.
This document is the primary reference for the Rust programming language grammar. It
provides only one kind of material:
- Chapters that formally define the language grammar.
This document does not serve as an introduction to the language. Background
familiarity with the language is assumed. A separate [guide] is available to
help acquire such background.
This document also does not serve as a reference to the [standard] library
included in the language distribution. Those libraries are documented
separately by extracting documentation attributes from their source code. Many
of the features that one might expect to be language features are library
features in Rust, so what you're looking for may be there, not here.
[guide]: guide.html
[standard]: std/index.html
# Notation
Rust's grammar is defined over Unicode codepoints, each conventionally denoted
`U+XXXX`, for 4 or more hexadecimal digits `X`. _Most_ of Rust's grammar is
confined to the ASCII range of Unicode, and is described in this document by a
dialect of Extended Backus-Naur Form (EBNF), specifically a dialect of EBNF
supported by common automated LL(k) parsing tools such as `llgen`, rather than
the dialect given in ISO 14977. The dialect can be defined self-referentially
as follows:
```antlr
grammar : rule + ;
rule : nonterminal ':' productionrule ';' ;
productionrule : production [ '|' production ] * ;
production : term * ;
term : element repeats ;
element : LITERAL | IDENTIFIER | '[' productionrule ']' ;
repeats : [ '*' | '+' ] NUMBER ? | NUMBER ? | '?' ;
```
Where:
- Whitespace in the grammar is ignored.
- Square brackets are used to group rules.
- `LITERAL` is a single printable ASCII character, or an escaped hexadecimal
ASCII code of the form `\xQQ`, in single quotes, denoting the corresponding
Unicode codepoint `U+00QQ`.
- `IDENTIFIER` is a nonempty string of ASCII letters and underscores.
- The `repeat` forms apply to the adjacent `element`, and are as follows:
- `?` means zero or one repetition
- `*` means zero or more repetitions
- `+` means one or more repetitions
- NUMBER trailing a repeat symbol gives a maximum repetition count
- NUMBER on its own gives an exact repetition count
This EBNF dialect should hopefully be familiar to many readers.
## Unicode productions
A few productions in Rust's grammar permit Unicode codepoints outside the ASCII
range. We define these productions in terms of character properties specified
in the Unicode standard, rather than in terms of ASCII-range codepoints. The
section [Special Unicode Productions](#special-unicode-productions) lists these
productions.
## String table productions
Some rules in the grammar — notably [unary
operators](#unary-operator-expressions), [binary
operators](#binary-operator-expressions), and [keywords](#keywords) — are
given in a simplified form: as a listing of a table of unquoted, printable
whitespace-separated strings. These cases form a subset of the rules regarding
the [token](#tokens) rule, and are assumed to be the result of a
lexical-analysis phase feeding the parser, driven by a DFA, operating over the
disjunction of all such string table entries.
When such a string enclosed in double-quotes (`"`) occurs inside the grammar,
it is an implicit reference to a single member of such a string table
production. See [tokens](#tokens) for more information.
# Lexical structure
## Input format
Rust input is interpreted as a sequence of Unicode codepoints encoded in UTF-8.
Most Rust grammar rules are defined in terms of printable ASCII-range
codepoints, but a small number are defined in terms of Unicode properties or
explicit codepoint lists. [^inputformat]
[^inputformat]: Substitute definitions for the special Unicode productions are
provided to the grammar verifier, restricted to ASCII range, when verifying the
grammar in this document.
## Special Unicode Productions
The following productions in the Rust grammar are defined in terms of Unicode
properties: `ident`, `non_null`, `non_eol`, `non_single_quote` and
`non_double_quote`.
### Identifiers
The `ident` production is any nonempty Unicode string of
the following form:
- The first character is in one of the following ranges `U+0041` to `U+005A`
("A" to "Z"), `U+0061` to `U+007A` ("a" to "z"), or `U+005F` ("\_").
- The remaining characters are in the range `U+0030` to `U+0039` ("0" to "9"),
or any of the prior valid initial characters.
as long as the identifier does _not_ occur in the set of [keywords](#keywords).
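As an illustrative sketch, a few strings that do and do not satisfy the `ident` production:
```rust
fn main() {
    let snake_case = 1; // starts with a letter
    let _x9 = 2;        // starts with `_`; digits are allowed after the first character
    // let 9lives = 3;  // rejected: identifiers may not start with a digit
    // let match = 4;   // rejected: `match` is a keyword, excluded from `ident`
    println!("{} {}", snake_case, _x9);
}
```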
### Delimiter-restricted productions
Some productions are defined by exclusion of particular Unicode characters:
- `non_null` is any single Unicode character aside from `U+0000` (null)
- `non_eol` is any single Unicode character aside from `U+000A` (`'\n'`)
- `non_single_quote` is any single Unicode character aside from `U+0027` (`'`)
- `non_double_quote` is any single Unicode character aside from `U+0022` (`"`)
## Comments
```antlr
comment : block_comment | line_comment ;
block_comment : "/*" block_comment_body * "*/" ;
block_comment_body : [block_comment | character] * ;
line_comment : "//" non_eol * ;
```
**FIXME:** add doc grammar?
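For illustration, the comment forms above as they appear in source:
```rust
//! Inner doc comment: documents the enclosing crate or module.

// Line comment: runs to the end of the line.

/* Block comment; /* block comments nest, */ so this is still inside the outer one. */

/// Outer doc comment: documents the item that follows.
fn documented() {}

fn main() {
    documented();
}
```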
## Whitespace
```antlr
whitespace_char : '\x20' | '\x09' | '\x0a' | '\x0d' ;
whitespace : [ whitespace_char | comment ] + ;
```
## Tokens
```antlr
simple_token : keyword | unop | binop ;
token : simple_token | ident | literal | symbol | whitespace token ;
```
### Keywords
<p id="keyword-table-marker"></p>
| | | | | |
|----------|----------|----------|----------|----------|
| _ | abstract | alignof | as | become |
| box | break | const | continue | crate |
| do | else | enum | extern | false |
| final | fn | for | if | impl |
| in | let | loop | macro | match |
| mod | move | mut | offsetof | override |
| priv | proc | pub | pure | ref |
| return | Self | self | sizeof | static |
| struct | super | trait | true | type |
| typeof | unsafe | unsized | use | virtual |
| where | while | yield | | |
Each of these keywords has special meaning in its grammar, and all of them are
excluded from the `ident` rule.
Not all of these keywords are used by the language. Some of them were used
before Rust 1.0, and were left reserved once their implementations were
removed. Some of them were reserved before 1.0 to make space for possible
future features.
### Literals
```antlr
lit_suffix : ident;
literal : [ string_lit | char_lit | byte_string_lit | byte_lit | num_lit | bool_lit ] lit_suffix ?;
```
The optional `lit_suffix` production is only used for certain numeric literals,
but is reserved for future extension. That is, the above gives the lexical
grammar, but a Rust parser will reject everything but the 12 special cases
mentioned in [Number literals](reference/tokens.html#number-literals) in the
reference.
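As a short example of the suffix forms the parser does accept:
```rust
fn main() {
    let a = 1u8;        // integer literal with the `u8` suffix
    let b = 500_000i64; // underscores are ignored; `i64` suffix
    let c = 2.5f32;     // float literal with the `f32` suffix
    let d = 0xFF_usize; // hexadecimal literal with the `usize` suffix
    println!("{} {} {} {}", a, b, c, d);
}
```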
#### Character and string literals
```antlr
char_lit : '\x27' char_body '\x27' ;
string_lit : '"' string_body * '"' | 'r' raw_string ;
char_body : non_single_quote
| '\x5c' [ '\x27' | common_escape | unicode_escape ] ;
string_body : non_double_quote
| '\x5c' [ '\x22' | common_escape | unicode_escape ] ;
raw_string : '"' raw_string_body '"' | '#' raw_string '#' ;
common_escape : '\x5c'
| 'n' | 'r' | 't' | '0'
| 'x' hex_digit 2 ;
unicode_escape : 'u' '{' hex_digit+ 6 '}';
hex_digit : 'a' | 'b' | 'c' | 'd' | 'e' | 'f'
| 'A' | 'B' | 'C' | 'D' | 'E' | 'F'
| dec_digit ;
oct_digit : '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' ;
dec_digit : '0' | nonzero_dec ;
nonzero_dec: '1' | '2' | '3' | '4'
| '5' | '6' | '7' | '8' | '9' ;
```
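An illustrative sketch of character, string, and raw string literals using these escape forms:
```rust
fn main() {
    let crab = '\u{1F980}';       // unicode_escape: one to six hex digits
    let tab = '\t';               // common_escape
    let s = "line one\nline two"; // string_body with an escape
    let raw = r#"no \n processing, and "quotes" are fine"#; // raw_string with one `#`
    println!("{} {} {} {}", crab, tab, s, raw);
}
```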
#### Byte and byte string literals
```antlr
byte_lit : "b\x27" byte_body '\x27' ;
byte_string_lit : "b\x22" string_body * '\x22' | "br" raw_byte_string ;
byte_body : ascii_non_single_quote
| '\x5c' [ '\x27' | common_escape ] ;
byte_string_body : ascii_non_double_quote
| '\x5c' [ '\x22' | common_escape ] ;
raw_byte_string : '"' raw_byte_string_body '"' | '#' raw_byte_string '#' ;
```
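For illustration, byte and byte string literals corresponding to these productions:
```rust
fn main() {
    let b = b'A';                       // byte_lit: a u8 value
    let bs: &[u8] = b"bytes\x00";       // byte_string_lit with a hex escape
    let raw: &[u8] = br#"raw "bytes""#; // raw byte string: no escape processing
    println!("{} {} {}", b, bs.len(), raw.len());
}
```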
#### Number literals
```antlr
num_lit : nonzero_dec [ dec_digit | '_' ] * float_suffix ?
| '0' [ [ dec_digit | '_' ] * float_suffix ?
| 'b' [ '1' | '0' | '_' ] +
| 'o' [ oct_digit | '_' ] +
| 'x' [ hex_digit | '_' ] + ] ;
float_suffix : [ exponent | '.' dec_lit exponent ? ] ? ;
exponent : ['E' | 'e'] ['-' | '+' ] ? dec_lit ;
dec_lit : [ dec_digit | '_' ] + ;
```
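The main `num_lit` shapes, written out as an example:
```rust
fn main() {
    let dec = 1_000;           // decimal with `_` separators
    let bin = 0b1010_1010;     // binary
    let oct = 0o755;           // octal
    let hex = 0xDEAD_BEEF_u64; // hexadecimal with a type suffix
    let flt = 12.5e-3;         // float with an exponent
    println!("{} {} {} {} {}", dec, bin, oct, hex, flt);
}
```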
#### Boolean literals
```antlr
bool_lit : [ "true" | "false" ] ;
```
The two values of the boolean type are written `true` and `false`.
### Symbols
```antlr
symbol : "::" | "->"
| '#' | '[' | ']' | '(' | ')' | '{' | '}'
| ',' | ';' ;
```
Symbols are a general class of printable [tokens](#tokens) that play structural
roles in a variety of grammar productions. They are cataloged here for
completeness as the set of remaining miscellaneous printable tokens that do not
otherwise appear as [unary operators](#unary-operator-expressions), [binary
operators](#binary-operator-expressions), or [keywords](#keywords).
## Paths
```antlr
expr_path : [ "::" ] ident [ "::" expr_path_tail ] + ;
expr_path_tail : '<' type_expr [ ',' type_expr ] + '>'
| expr_path ;
type_path : ident [ type_path_tail ] + ;
type_path_tail : '<' type_expr [ ',' type_expr ] + '>'
| "::" type_path ;
```
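As an illustrative sketch, expression and type paths in source (standard library names such as `Vec` and `HashMap` are used purely as examples):
```rust
use std::collections::HashMap;

fn main() {
    let v: Vec<i32> = Vec::new();          // type_path with a generic argument
    let m = HashMap::<String, i32>::new(); // expr_path with `::<...>` arguments
    let n = ::std::mem::size_of::<u8>();   // leading `::` anchors the path at the crate root
    println!("{} {} {}", v.len(), m.len(), n);
}
```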
# Syntax extensions
## Macros
```antlr
expr_macro_rules : "macro_rules" '!' ident '(' macro_rule * ')' ';'
| "macro_rules" '!' ident '{' macro_rule * '}' ;
macro_rule : '(' matcher * ')' "=>" '(' transcriber * ')' ';' ;
matcher : '(' matcher * ')' | '[' matcher * ']'
| '{' matcher * '}' | '$' ident ':' ident
| '$' '(' matcher * ')' sep_token? [ '*' | '+' ]
| non_special_token ;
transcriber : '(' transcriber * ')' | '[' transcriber * ']'
| '{' transcriber * '}' | '$' ident
| '$' '(' transcriber * ')' sep_token? [ '*' | '+' ]
| non_special_token ;
```
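A minimal example of a `macro_rules!` definition exercising the matcher and transcriber productions, including a `$( ... ),*` repetition:
```rust
macro_rules! sum {
    ( $( $x:expr ),* ) => {
        0 $( + $x )*
    };
}

fn main() {
    println!("{}", sum!(1, 2, 3)); // expands to 0 + 1 + 2 + 3
    println!("{}", sum!());        // the repetition may match zero times
}
```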
# Crates and source files
**FIXME:** grammar? What production covers #![crate_id = "foo"] ?
# Items and attributes
**FIXME:** grammar?
## Items
```antlr
item : vis ? mod_item | fn_item | type_item | struct_item | enum_item
| const_item | static_item | trait_item | impl_item | extern_block_item ;
```
### Type Parameters
**FIXME:** grammar?
### Modules
```antlr
mod_item : "mod" ident ( ';' | '{' mod '}' );
mod : [ view_item | item ] * ;
```
#### View items
```antlr
view_item : extern_crate_decl | use_decl ';' ;
```
##### Extern crate declarations
```antlr
extern_crate_decl : "extern" "crate" crate_name ;
crate_name: ident | ( ident "as" ident ) ;
```
##### Use declarations
```antlr
use_decl : vis ? "use" [ path "as" ident
| path_glob ] ;
path_glob : ident [ "::" [ path_glob
| '*' ] ] ?
| '{' path_item [ ',' path_item ] * '}' ;
path_item : ident | "self" ;
```
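For illustration, the `use_decl` forms (a plain path, a rename with `as`, and a brace list):
```rust
use std::collections::HashMap;
use std::io::Result as IoResult;
use std::cmp::{min, max};

fn main() -> IoResult<()> {
    let mut map = HashMap::new();
    map.insert("clamped", min(10, max(1, 7)));
    println!("{:?}", map);
    Ok(())
}
```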
### Functions
**FIXME:** grammar?
#### Generic functions
**FIXME:** grammar?
#### Unsafety
**FIXME:** grammar?
##### Unsafe functions
**FIXME:** grammar?
##### Unsafe blocks
**FIXME:** grammar?
#### Diverging functions
**FIXME:** grammar?
### Type definitions
**FIXME:** grammar?
### Structures
**FIXME:** grammar?
### Enumerations
**FIXME:** grammar?
### Constant items
```antlr
const_item : "const" ident ':' type '=' expr ';' ;
```
### Static items
```antlr
static_item : "static" ident ':' type '=' expr ';' ;
```
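As a brief example, a `const_item` and a `static_item` as written in source:
```rust
const MAX_RETRIES: u32 = 3;
static GREETING: &str = "hello";

fn main() {
    println!("{} (up to {} retries)", GREETING, MAX_RETRIES);
}
```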
#### Mutable statics
**FIXME:** grammar?
### Traits
**FIXME:** grammar?
### Implementations
**FIXME:** grammar?
### External blocks
```antlr
extern_block_item : "extern" '{' extern_block '}' ;
extern_block : [ foreign_fn ] * ;
```
## Visibility and Privacy
```antlr
vis : "pub" ;
```
### Re-exporting and Visibility
See [Use declarations](#use-declarations).
## Attributes
```antlr
attribute : '#' '!' ? '[' meta_item ']' ;
meta_item : ident [ '=' literal
| '(' meta_seq ')' ] ? ;
meta_seq : meta_item [ ',' meta_seq ] ? ;
```
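An illustrative sketch of the attribute shapes covered by `meta_item`: an inner attribute, a parenthesised list, and a `name = literal` form:
```rust
#![allow(dead_code)]

#[derive(Debug, Clone)]          // meta_item with a meta_seq in parentheses
#[doc = "A point in the plane."] // meta_item with `= literal`
struct Point {
    x: f64,
    y: f64,
}

fn main() {
    println!("{:?}", Point { x: 1.0, y: 2.0 });
}
```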
# Statements and expressions
## Statements
```antlr
stmt : decl_stmt | expr_stmt | ';' ;
```
### Declaration statements
```antlr
decl_stmt : item | let_decl ;
```
#### Item declarations
See [Items](#items).
#### Variable declarations
```antlr
let_decl : "let" pat [':' type ] ? [ init ] ? ';' ;
init : [ '=' ] expr ;
```
### Expression statements
```antlr
expr_stmt : expr ';' ;
```
## Expressions
```antlr
expr : literal | path | tuple_expr | unit_expr | struct_expr
| block_expr | method_call_expr | field_expr | array_expr
| idx_expr | range_expr | unop_expr | binop_expr
| paren_expr | call_expr | lambda_expr | while_expr
| loop_expr | break_expr | continue_expr | for_expr
| if_expr | match_expr | if_let_expr | while_let_expr
| return_expr ;
```
#### Lvalues, rvalues and temporaries
**FIXME:** grammar?
#### Moved and copied types
**FIXME:** Do we want to capture this in the grammar as different productions?
### Literal expressions
See [Literals](#literals).
### Path expressions
See [Paths](#paths).
### Tuple expressions
```antlr
tuple_expr : '(' [ expr [ ',' expr ] * | expr ',' ] ? ')' ;
```
### Unit expressions
```antlr
unit_expr : "()" ;
```
### Structure expressions
```antlr
struct_expr_field_init : ident | ident ':' expr ;
struct_expr : expr_path '{' struct_expr_field_init
[ ',' struct_expr_field_init ] *
[ ".." expr ] '}' |
expr_path '(' expr
[ ',' expr ] * ')' |
expr_path ;
```
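For illustration, the three `struct_expr` forms, including the `..` functional-update base:
```rust
struct Config { verbose: bool, retries: u32 }
struct Pair(i32, i32);
struct Unit;

fn main() {
    let base = Config { verbose: false, retries: 3 };
    let cfg = Config { verbose: true, ..base }; // field init with a `..base` update
    let pair = Pair(1, 2);                      // tuple-struct form
    let _unit = Unit;                           // bare unit-struct path
    println!("{} {} {} {}", cfg.verbose, cfg.retries, pair.0, pair.1);
}
```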
### Block expressions
```antlr
block_expr : '{' [ stmt | item ] *
[ expr ] '}' ;
```
### Method-call expressions
```antlr
method_call_expr : expr '.' ident paren_expr_list ;
```
### Field expressions
```antlr
field_expr : expr '.' ident ;
```
### Array expressions
```antlr
array_expr : '[' "mut" ? array_elems? ']' ;
array_elems : [expr [',' expr]*] | [expr ';' expr] ;
```
### Index expressions
```antlr
idx_expr : expr '[' expr ']' ;
```
### Range expressions
```antlr
range_expr : expr ".." expr |
expr ".." |
".." expr |
".." ;
```
### Unary operator expressions
```antlr
unop_expr : unop expr ;
unop : '-' | '*' | '!' ;
```
### Binary operator expressions
```antlr
binop_expr : expr binop expr | type_cast_expr
| assignment_expr | compound_assignment_expr ;
binop : arith_op | bitwise_op | lazy_bool_op | comp_op ;
```
#### Arithmetic operators
```antlr
arith_op : '+' | '-' | '*' | '/' | '%' ;
```
#### Bitwise operators
```antlr
bitwise_op : '&' | '|' | '^' | "<<" | ">>" ;
```
#### Lazy boolean operators
```antlr
lazy_bool_op : "&&" | "||" ;
```
#### Comparison operators
```antlr
comp_op : "==" | "!=" | '<' | '>' | "<=" | ">=" ;
```
#### Type cast expressions
```antlr
type_cast_expr : value "as" type ;
```
#### Assignment expressions
```antlr
assignment_expr : expr '=' expr ;
```
#### Compound assignment expressions
```antlr
compound_assignment_expr : expr [ arith_op | bitwise_op ] '=' expr ;
```
### Grouped expressions
```antlr
paren_expr : '(' expr ')' ;
```
### Call expressions
```antlr
expr_list : [ expr [ ',' expr ]* ] ? ;
paren_expr_list : '(' expr_list ')' ;
call_expr : expr paren_expr_list ;
```
### Lambda expressions
```antlr
ident_list : [ ident [ ',' ident ]* ] ? ;
lambda_expr : '|' ident_list '|' expr ;
```
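As an example, `lambda_expr` in source; the annotated form with a return type and block body is also accepted by Rust, although only the bare form is listed above:
```rust
fn main() {
    let add = |a, b| a + b;                 // parameter types inferred
    let square = |x: i64| -> i64 { x * x }; // annotated parameters and return type
    let tens: Vec<i64> = (1..=3).map(|n| n * 10).collect();
    println!("{} {} {:?}", add(1, 2), square(4), tens);
}
```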
### While loops
```antlr
while_expr : [ lifetime ':' ] ? "while" no_struct_literal_expr '{' block '}' ;
```
### Infinite loops
```antlr
loop_expr : [ lifetime ':' ] ? "loop" '{' block '}';
```
### Break expressions
```antlr
break_expr : "break" [ lifetime ] ?;
```
### Continue expressions
```antlr
continue_expr : "continue" [ lifetime ] ?;
```
### For expressions
```antlr
for_expr : [ lifetime ':' ] ? "for" pat "in" no_struct_literal_expr '{' block '}' ;
```
### If expressions
```antlr
if_expr : "if" no_struct_literal_expr '{' block '}'
else_tail ? ;
else_tail : "else" [ if_expr | if_let_expr
| '{' block '}' ] ;
```
### Match expressions
```antlr
match_expr : "match" no_struct_literal_expr '{' match_arm * '}' ;
match_arm : attribute * match_pat "=>" [ expr "," | '{' block '}' ] ;
match_pat : pat [ '|' pat ] * [ "if" expr ] ? ;
```
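An illustrative `match_expr` with an or-pattern, a guard, and both arm-body forms (an expression followed by `,`, and a block):
```rust
fn main() {
    let n = 7;
    let label = match n {
        0 | 1 => "small",          // or-pattern in match_pat
        x if x % 2 == 0 => "even", // pattern with a guard
        _ => {
            "odd and larger than one" // block body; no trailing comma required
        }
    };
    println!("{} is {}", n, label);
}
```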
### If let expressions
```antlr
if_let_expr : "if" "let" pat '=' expr '{' block '}'
else_tail ? ;
```
### While let loops
```antlr
while_let_expr : [ lifetime ':' ] ? "while" "let" pat '=' expr '{' block '}' ;
```
### Return expressions
```antlr
return_expr : "return" expr ? ;
```
# Type system
**FIXME:** is this entire chapter relevant here? Or should it all have been covered by some production already?
## Types
### Primitive types
**FIXME:** grammar?
#### Machine types
**FIXME:** grammar?
#### Machine-dependent integer types
**FIXME:** grammar?
### Textual types
**FIXME:** grammar?
### Tuple types
**FIXME:** grammar?
### Array, and Slice types
**FIXME:** grammar?
### Structure types
**FIXME:** grammar?
### Enumerated types
**FIXME:** grammar?
### Pointer types
**FIXME:** grammar?
### Function types
**FIXME:** grammar?
### Closure types
```antlr
closure_type := [ 'unsafe' ] [ '<' lifetime-list '>' ] '|' arg-list '|'
[ ':' bound-list ] [ '->' type ]
lifetime-list := lifetime | lifetime ',' lifetime-list
arg-list := ident ':' type | ident ':' type ',' arg-list
```
### Never type
An empty type
```antlr
never_type : "!" ;
```
### Object types
**FIXME:** grammar?
### Type parameters
**FIXME:** grammar?
### Type parameter bounds
```antlr
bound-list := bound | bound '+' bound-list '+' ?
bound := ty_bound | lt_bound
lt_bound := lifetime
ty_bound := ty_bound_noparen | (ty_bound_noparen)
ty_bound_noparen := [?] [ for<lt_param_defs> ] simple_path
```
### Self types
**FIXME:** grammar?
## Type kinds
**FIXME:** this is probably not relevant to the grammar...
# Memory and concurrency models
**FIXME:** is this entire chapter relevant here? Or should it all have been covered by some production already?
## Memory model
### Memory allocation and lifetime
### Memory ownership
### Variables
### Boxes
## Threads
### Communication between threads
### Thread lifecycle
[reference]: https://doc.rust-lang.org/reference/
[grammar working group]: https://github.com/rust-lang/wg-grammar


@ -1,3 +0,0 @@
*.class
*.java
*.tokens


@ -1,350 +0,0 @@
%{
#include <stdio.h>
#include <ctype.h>
static int num_hashes;
static int end_hashes;
static int saw_non_hash;
%}
%option stack
%option yylineno
%x str
%x rawstr
%x rawstr_esc_begin
%x rawstr_esc_body
%x rawstr_esc_end
%x byte
%x bytestr
%x rawbytestr
%x rawbytestr_nohash
%x pound
%x shebang_or_attr
%x ltorchar
%x linecomment
%x doc_line
%x blockcomment
%x doc_block
%x suffix
ident [a-zA-Z\x80-\xff_][a-zA-Z0-9\x80-\xff_]*
%%
<suffix>{ident} { BEGIN(INITIAL); }
<suffix>(.|\n) { yyless(0); BEGIN(INITIAL); }
[ \n\t\r] { }
\xef\xbb\xbf {
// UTF-8 byte order mark (BOM), ignore if in line 1, error otherwise
if (yyget_lineno() != 1) {
return -1;
}
}
\/\/(\/|\!) { BEGIN(doc_line); yymore(); }
<doc_line>\n { BEGIN(INITIAL);
yyleng--;
yytext[yyleng] = 0;
return ((yytext[2] == '!') ? INNER_DOC_COMMENT : OUTER_DOC_COMMENT);
}
<doc_line>[^\n]* { yymore(); }
\/\/|\/\/\/\/ { BEGIN(linecomment); }
<linecomment>\n { BEGIN(INITIAL); }
<linecomment>[^\n]* { }
\/\*(\*|\!)[^*] { yy_push_state(INITIAL); yy_push_state(doc_block); yymore(); }
<doc_block>\/\* { yy_push_state(doc_block); yymore(); }
<doc_block>\*\/ {
yy_pop_state();
if (yy_top_state() == doc_block) {
yymore();
} else {
return ((yytext[2] == '!') ? INNER_DOC_COMMENT : OUTER_DOC_COMMENT);
}
}
<doc_block>(.|\n) { yymore(); }
\/\* { yy_push_state(blockcomment); }
<blockcomment>\/\* { yy_push_state(blockcomment); }
<blockcomment>\*\/ { yy_pop_state(); }
<blockcomment>(.|\n) { }
_ { return UNDERSCORE; }
abstract { return ABSTRACT; }
alignof { return ALIGNOF; }
as { return AS; }
become { return BECOME; }
box { return BOX; }
break { return BREAK; }
catch { return CATCH; }
const { return CONST; }
continue { return CONTINUE; }
crate { return CRATE; }
default { return DEFAULT; }
do { return DO; }
else { return ELSE; }
enum { return ENUM; }
extern { return EXTERN; }
false { return FALSE; }
final { return FINAL; }
fn { return FN; }
for { return FOR; }
if { return IF; }
impl { return IMPL; }
in { return IN; }
let { return LET; }
loop { return LOOP; }
macro { return MACRO; }
match { return MATCH; }
mod { return MOD; }
move { return MOVE; }
mut { return MUT; }
offsetof { return OFFSETOF; }
override { return OVERRIDE; }
priv { return PRIV; }
proc { return PROC; }
pure { return PURE; }
pub { return PUB; }
ref { return REF; }
return { return RETURN; }
self { return SELF; }
sizeof { return SIZEOF; }
static { return STATIC; }
struct { return STRUCT; }
super { return SUPER; }
trait { return TRAIT; }
true { return TRUE; }
type { return TYPE; }
typeof { return TYPEOF; }
union { return UNION; }
unsafe { return UNSAFE; }
unsized { return UNSIZED; }
use { return USE; }
virtual { return VIRTUAL; }
where { return WHERE; }
while { return WHILE; }
yield { return YIELD; }
{ident} { return IDENT; }
0x[0-9a-fA-F_]+ { BEGIN(suffix); return LIT_INTEGER; }
0o[0-7_]+ { BEGIN(suffix); return LIT_INTEGER; }
0b[01_]+ { BEGIN(suffix); return LIT_INTEGER; }
[0-9][0-9_]* { BEGIN(suffix); return LIT_INTEGER; }
[0-9][0-9_]*\.(\.|[a-zA-Z]) { yyless(yyleng - 2); BEGIN(suffix); return LIT_INTEGER; }
[0-9][0-9_]*\.[0-9_]*([eE][-\+]?[0-9_]+)? { BEGIN(suffix); return LIT_FLOAT; }
[0-9][0-9_]*(\.[0-9_]*)?[eE][-\+]?[0-9_]+ { BEGIN(suffix); return LIT_FLOAT; }
; { return ';'; }
, { return ','; }
\.\.\. { return DOTDOTDOT; }
\.\. { return DOTDOT; }
\. { return '.'; }
\( { return '('; }
\) { return ')'; }
\{ { return '{'; }
\} { return '}'; }
\[ { return '['; }
\] { return ']'; }
@ { return '@'; }
# { BEGIN(pound); yymore(); }
<pound>\! { BEGIN(shebang_or_attr); yymore(); }
<shebang_or_attr>\[ {
BEGIN(INITIAL);
yyless(2);
return SHEBANG;
}
<shebang_or_attr>[^\[\n]*\n {
// Since the \n was eaten as part of the token, yylineno will have
// been incremented to the value 2 if the shebang was on the first
// line. This yyless undoes that, setting yylineno back to 1.
yyless(yyleng - 1);
if (yyget_lineno() == 1) {
BEGIN(INITIAL);
return SHEBANG_LINE;
} else {
BEGIN(INITIAL);
yyless(2);
return SHEBANG;
}
}
<pound>. { BEGIN(INITIAL); yyless(1); return '#'; }
\~ { return '~'; }
:: { return MOD_SEP; }
: { return ':'; }
\$ { return '$'; }
\? { return '?'; }
== { return EQEQ; }
=> { return FAT_ARROW; }
= { return '='; }
\!= { return NE; }
\! { return '!'; }
\<= { return LE; }
\<\< { return SHL; }
\<\<= { return SHLEQ; }
\< { return '<'; }
\>= { return GE; }
\>\> { return SHR; }
\>\>= { return SHREQ; }
\> { return '>'; }
\x27 { BEGIN(ltorchar); yymore(); }
<ltorchar>static { BEGIN(INITIAL); return STATIC_LIFETIME; }
<ltorchar>{ident} { BEGIN(INITIAL); return LIFETIME; }
<ltorchar>\\[nrt\\\x27\x220]\x27 { BEGIN(suffix); return LIT_CHAR; }
<ltorchar>\\x[0-9a-fA-F]{2}\x27 { BEGIN(suffix); return LIT_CHAR; }
<ltorchar>\\u\{([0-9a-fA-F]_*){1,6}\}\x27 { BEGIN(suffix); return LIT_CHAR; }
<ltorchar>.\x27 { BEGIN(suffix); return LIT_CHAR; }
<ltorchar>[\x80-\xff]{2,4}\x27 { BEGIN(suffix); return LIT_CHAR; }
<ltorchar><<EOF>> { BEGIN(INITIAL); return -1; }
b\x22 { BEGIN(bytestr); yymore(); }
<bytestr>\x22 { BEGIN(suffix); return LIT_BYTE_STR; }
<bytestr><<EOF>> { return -1; }
<bytestr>\\[n\nrt\\\x27\x220] { yymore(); }
<bytestr>\\x[0-9a-fA-F]{2} { yymore(); }
<bytestr>\\u\{([0-9a-fA-F]_*){1,6}\} { yymore(); }
<bytestr>\\[^n\nrt\\\x27\x220] { return -1; }
<bytestr>(.|\n) { yymore(); }
br\x22 { BEGIN(rawbytestr_nohash); yymore(); }
<rawbytestr_nohash>\x22 { BEGIN(suffix); return LIT_BYTE_STR_RAW; }
<rawbytestr_nohash>(.|\n) { yymore(); }
<rawbytestr_nohash><<EOF>> { return -1; }
br/# {
BEGIN(rawbytestr);
yymore();
num_hashes = 0;
saw_non_hash = 0;
end_hashes = 0;
}
<rawbytestr># {
if (!saw_non_hash) {
num_hashes++;
} else if (end_hashes != 0) {
end_hashes++;
if (end_hashes == num_hashes) {
BEGIN(INITIAL);
return LIT_BYTE_STR_RAW;
}
}
yymore();
}
<rawbytestr>\x22# {
end_hashes = 1;
if (end_hashes == num_hashes) {
BEGIN(INITIAL);
return LIT_BYTE_STR_RAW;
}
yymore();
}
<rawbytestr>(.|\n) {
if (!saw_non_hash) {
saw_non_hash = 1;
}
if (end_hashes != 0) {
end_hashes = 0;
}
yymore();
}
<rawbytestr><<EOF>> { return -1; }
b\x27 { BEGIN(byte); yymore(); }
<byte>\\[nrt\\\x27\x220]\x27 { BEGIN(INITIAL); return LIT_BYTE; }
<byte>\\x[0-9a-fA-F]{2}\x27 { BEGIN(INITIAL); return LIT_BYTE; }
<byte>\\u([0-9a-fA-F]_*){4}\x27 { BEGIN(INITIAL); return LIT_BYTE; }
<byte>\\U([0-9a-fA-F]_*){8}\x27 { BEGIN(INITIAL); return LIT_BYTE; }
<byte>.\x27 { BEGIN(INITIAL); return LIT_BYTE; }
<byte><<EOF>> { BEGIN(INITIAL); return -1; }
r\x22 { BEGIN(rawstr); yymore(); }
<rawstr>\x22 { BEGIN(suffix); return LIT_STR_RAW; }
<rawstr>(.|\n) { yymore(); }
<rawstr><<EOF>> { return -1; }
r/# {
BEGIN(rawstr_esc_begin);
yymore();
num_hashes = 0;
saw_non_hash = 0;
end_hashes = 0;
}
<rawstr_esc_begin># {
num_hashes++;
yymore();
}
<rawstr_esc_begin>\x22 {
BEGIN(rawstr_esc_body);
yymore();
}
<rawstr_esc_begin>(.|\n) { return -1; }
<rawstr_esc_body>\x22/# {
BEGIN(rawstr_esc_end);
yymore();
}
<rawstr_esc_body>(.|\n) {
yymore();
}
<rawstr_esc_end># {
end_hashes++;
if (end_hashes == num_hashes) {
BEGIN(INITIAL);
return LIT_STR_RAW;
}
yymore();
}
<rawstr_esc_end>[^#] {
end_hashes = 0;
BEGIN(rawstr_esc_body);
yymore();
}
<rawstr_esc_begin,rawstr_esc_body,rawstr_esc_end><<EOF>> { return -1; }
\x22 { BEGIN(str); yymore(); }
<str>\x22 { BEGIN(suffix); return LIT_STR; }
<str><<EOF>> { return -1; }
<str>\\[n\nr\rt\\\x27\x220] { yymore(); }
<str>\\x[0-9a-fA-F]{2} { yymore(); }
<str>\\u\{([0-9a-fA-F]_*){1,6}\} { yymore(); }
<str>\\[^n\nrt\\\x27\x220] { return -1; }
<str>(.|\n) { yymore(); }
\<- { return LARROW; }
-\> { return RARROW; }
- { return '-'; }
-= { return MINUSEQ; }
&& { return ANDAND; }
& { return '&'; }
&= { return ANDEQ; }
\|\| { return OROR; }
\| { return '|'; }
\|= { return OREQ; }
\+ { return '+'; }
\+= { return PLUSEQ; }
\* { return '*'; }
\*= { return STAREQ; }
\/ { return '/'; }
\/= { return SLASHEQ; }
\^ { return '^'; }
\^= { return CARETEQ; }
% { return '%'; }
%= { return PERCENTEQ; }
<<EOF>> { return 0; }
%%


@ -1,193 +0,0 @@
#include <stdio.h>
#include <stdarg.h>
#include <stdlib.h>
#include <string.h>
extern int yylex();
extern int rsparse();
#define PUSHBACK_LEN 4
static char pushback[PUSHBACK_LEN];
static int verbose;
void print(const char* format, ...) {
va_list args;
va_start(args, format);
if (verbose) {
vprintf(format, args);
}
va_end(args);
}
// If there is a non-null char at the head of the pushback queue,
// dequeue it and shift the rest of the queue forwards. Otherwise,
// return the token from calling yylex.
int rslex() {
if (pushback[0] == '\0') {
return yylex();
} else {
char c = pushback[0];
memmove(pushback, pushback + 1, PUSHBACK_LEN - 1);
pushback[PUSHBACK_LEN - 1] = '\0';
return c;
}
}
// Note: this does nothing if the pushback queue is full. As long as
// there aren't more than PUSHBACK_LEN consecutive calls to push_back
// in an action, this shouldn't be a problem.
void push_back(char c) {
for (int i = 0; i < PUSHBACK_LEN; ++i) {
if (pushback[i] == '\0') {
pushback[i] = c;
break;
}
}
}
extern int rsdebug;
struct node {
struct node *next;
struct node *prev;
int own_string;
char const *name;
int n_elems;
struct node *elems[];
};
struct node *nodes = NULL;
int n_nodes;
struct node *mk_node(char const *name, int n, ...) {
va_list ap;
int i = 0;
unsigned sz = sizeof(struct node) + (n * sizeof(struct node *));
struct node *nn, *nd = (struct node *)malloc(sz);
print("# New %d-ary node: %s = %p\n", n, name, nd);
nd->own_string = 0;
nd->prev = NULL;
nd->next = nodes;
if (nodes) {
nodes->prev = nd;
}
nodes = nd;
nd->name = name;
nd->n_elems = n;
va_start(ap, n);
while (i < n) {
nn = va_arg(ap, struct node *);
print("# arg[%d]: %p\n", i, nn);
print("# (%s ...)\n", nn->name);
nd->elems[i++] = nn;
}
va_end(ap);
n_nodes++;
return nd;
}
struct node *mk_atom(char *name) {
struct node *nd = mk_node((char const *)strdup(name), 0);
nd->own_string = 1;
return nd;
}
struct node *mk_none() {
return mk_atom("<none>");
}
struct node *ext_node(struct node *nd, int n, ...) {
va_list ap;
int i = 0, c = nd->n_elems + n;
unsigned sz = sizeof(struct node) + (c * sizeof(struct node *));
struct node *nn;
print("# Extending %d-ary node by %d nodes: %s = %p",
nd->n_elems, c, nd->name, nd);
if (nd->next) {
nd->next->prev = nd->prev;
}
if (nd->prev) {
nd->prev->next = nd->next;
}
nd = realloc(nd, sz);
nd->prev = NULL;
nd->next = nodes;
nodes->prev = nd;
nodes = nd;
print(" ==> %p\n", nd);
va_start(ap, n);
while (i < n) {
nn = va_arg(ap, struct node *);
print("# arg[%d]: %p\n", i, nn);
print("# (%s ...)\n", nn->name);
nd->elems[nd->n_elems++] = nn;
++i;
}
va_end(ap);
return nd;
}
int const indent_step = 4;
void print_indent(int depth) {
while (depth) {
if (depth-- % indent_step == 0) {
print("|");
} else {
print(" ");
}
}
}
void print_node(struct node *n, int depth) {
int i = 0;
print_indent(depth);
if (n->n_elems == 0) {
print("%s\n", n->name);
} else {
print("(%s\n", n->name);
for (i = 0; i < n->n_elems; ++i) {
print_node(n->elems[i], depth + indent_step);
}
print_indent(depth);
print(")\n");
}
}
int main(int argc, char **argv) {
if (argc == 2 && strcmp(argv[1], "-v") == 0) {
verbose = 1;
} else {
verbose = 0;
}
int ret = 0;
struct node *tmp;
memset(pushback, '\0', PUSHBACK_LEN);
ret = rsparse();
print("--- PARSE COMPLETE: ret:%d, n_nodes:%d ---\n", ret, n_nodes);
if (nodes) {
print_node(nodes, 0);
}
while (nodes) {
tmp = nodes;
nodes = tmp->next;
if (tmp->own_string) {
free((void*)tmp->name);
}
free(tmp);
}
return ret;
}
void rserror(char const *s) {
fprintf(stderr, "%s\n", s);
}

File diff suppressed because it is too large


@ -1,64 +0,0 @@
Rust's lexical grammar is not context-free. Raw string literals are the source
of the problem. Informally, a raw string literal is an `r`, followed by `N`
hashes (where N can be zero), a quote, any characters, then a quote followed
by `N` hashes. Critically, once inside the first pair of quotes,
another quote cannot be followed by `N` consecutive hashes. e.g.
`r###""###"###` is invalid.
This grammar describes this as closely as possible:
R -> 'r' S
S -> '"' B '"'
S -> '#' S '#'
B -> . B
B -> ε
Where `.` represents any character, and `ε` the empty string. Consider the
string `r#""#"#`. This string is not a valid raw string literal, but can be
accepted as one by the above grammar, using the derivation:
R : #""#"#
S : ""#"
S : "#
B : #
B : ε
(Where `T : U` means the rule `T` is applied, and `U` is the remainder of the
string.) The difficulty arises from the fact that it is fundamentally
context-sensitive. In particular, the context needed is the number of hashes.
To prove that Rust's string literals are not context-free, we will use
the fact that context-free languages are closed under intersection with
regular languages, and the
[pumping lemma for context-free languages](https://en.wikipedia.org/wiki/Pumping_lemma_for_context-free_languages).
Consider the regular language `R = r#+""#*"#+`. If Rust's raw string literals are
context-free, then their intersection with `R`, `R'`, should also be context-free.
Therefore, to prove that raw string literals are not context-free,
it is sufficient to prove that `R'` is not context-free.
The language `R'` is `{r#^n""#^m"#^n | m < n}`.
Assume `R'` *is* context-free. Then `R'` has some pumping length `p > 0` for which
the pumping lemma applies. Consider the following string `s` in `R'`:
`r#^p""#^{p-1}"#^p`
e.g. for `p = 2`: `s = r##""#"##`
Then `s = uvwxy` for some choice of `uvwxy` such that `vx` is non-empty,
`|vwx| < p+1`, and `uv^iwx^iy` is in `R'` for all `i >= 0`.
Neither `v` nor `x` can contain a `"` or `r`, as the number of these characters
in any string in `R'` is fixed. So `v` and `x` contain only hashes.
Consequently, of the three sequences of hashes, `v` and `x` combined
can only pump two of them.
If we ever choose the central sequence of hashes, then one of the outer sequences
will not grow when we pump, leading to an imbalance between the outer sequences.
Therefore, we must pump both outer sequences of hashes. However,
there are `p+2` characters between these two sequences of hashes, and `|vwx|` must
be less than `p+1`. Therefore we have a contradiction, and `R'` must not be
context-free.
Since `R'` is not context-free, it follows that Rust's raw string literals
must not be context-free.
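To make the hash-matching rule concrete, a short example of raw string literals a Rust lexer accepts, with the rejected case from above shown in a comment:
```rust
fn main() {
    let a = r"no hashes needed";
    let b = r#"one hash lets us write "quotes""#;
    let c = r##"two hashes let "# pass through unharmed"##;
    println!("{} / {} / {}", a, b, c);
    // let bad = r###""###"###; // rejected: the literal already ends at the first `"###`
}
```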


@ -1,66 +0,0 @@
#!/usr/bin/env python
# ignore-tidy-linelength
import sys
import os
import subprocess
import argparse
# usage: testparser.py [-h] [-p PARSER [PARSER ...]] -s SOURCE_DIR
# Parsers should read from stdin and return exit status 0 for a
# successful parse, and nonzero for an unsuccessful parse
parser = argparse.ArgumentParser()
parser.add_argument('-p', '--parser', nargs='+')
parser.add_argument('-s', '--source-dir', nargs=1, required=True)
args = parser.parse_args(sys.argv[1:])
total = 0
ok = {}
bad = {}
for parser in args.parser:
ok[parser] = 0
bad[parser] = []
devnull = open(os.devnull, 'w')
print("\n")
for base, dirs, files in os.walk(args.source_dir[0]):
for f in filter(lambda p: p.endswith('.rs'), files):
p = os.path.join(base, f)
parse_fail = 'parse-fail' in p
if sys.version_info.major == 3:
lines = open(p, encoding='utf-8').readlines()
else:
lines = open(p).readlines()
if any('ignore-test' in line or 'ignore-lexer-test' in line for line in lines):
continue
total += 1
for parser in args.parser:
if subprocess.call(parser, stdin=open(p), stderr=subprocess.STDOUT, stdout=devnull) == 0:
if parse_fail:
bad[parser].append(p)
else:
ok[parser] += 1
else:
if parse_fail:
ok[parser] += 1
else:
bad[parser].append(p)
parser_stats = ', '.join(['{}: {}'.format(parser, ok[parser]) for parser in args.parser])
sys.stdout.write("\033[K\r total: {}, {}, scanned {}"
.format(total, os.path.relpath(parser_stats), os.path.relpath(p)))
devnull.close()
print("\n")
for parser in args.parser:
filename = os.path.basename(parser) + '.bad'
print("writing {} files that did not yield the correct result with {} to {}".format(len(bad[parser]), parser, filename))
with open(filename, "w") as f:
for p in bad[parser]:
f.write(p)
f.write("\n")


@ -1,99 +0,0 @@
enum Token {
SHL = 257, // Parser generators reserve 0-256 for char literals
SHR,
LE,
EQEQ,
NE,
GE,
ANDAND,
OROR,
SHLEQ,
SHREQ,
MINUSEQ,
ANDEQ,
OREQ,
PLUSEQ,
STAREQ,
SLASHEQ,
CARETEQ,
PERCENTEQ,
DOTDOT,
DOTDOTDOT,
MOD_SEP,
LARROW,
RARROW,
FAT_ARROW,
LIT_BYTE,
LIT_CHAR,
LIT_INTEGER,
LIT_FLOAT,
LIT_STR,
LIT_STR_RAW,
LIT_BYTE_STR,
LIT_BYTE_STR_RAW,
IDENT,
UNDERSCORE,
LIFETIME,
// keywords
SELF,
STATIC,
ABSTRACT,
ALIGNOF,
AS,
BECOME,
BREAK,
CATCH,
CRATE,
DEFAULT,
DO,
ELSE,
ENUM,
EXTERN,
FALSE,
FINAL,
FN,
FOR,
IF,
IMPL,
IN,
LET,
LOOP,
MACRO,
MATCH,
MOD,
MOVE,
MUT,
OFFSETOF,
OVERRIDE,
PRIV,
PUB,
PURE,
REF,
RETURN,
SIZEOF,
STRUCT,
SUPER,
UNION,
TRUE,
TRAIT,
TYPE,
UNSAFE,
UNSIZED,
USE,
VIRTUAL,
WHILE,
YIELD,
CONTINUE,
PROC,
BOX,
CONST,
WHERE,
TYPEOF,
INNER_DOC_COMMENT,
OUTER_DOC_COMMENT,
SHEBANG,
SHEBANG_LINE,
STATIC_LIFETIME
};


@ -410,7 +410,7 @@ impl<'a> Parser<'a> {
&self.input[start..self.input.len()]
}
/// Parses an Argument structure, or what's contained within braces inside the format string
/// Parses an `Argument` structure, or what's contained within braces inside the format string.
fn argument(&mut self) -> Argument<'a> {
let pos = self.position();
let format = self.format();
@ -464,7 +464,7 @@ impl<'a> Parser<'a> {
}
/// Parses a format specifier at the current position, returning all of the
/// relevant information in the FormatSpec struct.
/// relevant information in the `FormatSpec` struct.
fn format(&mut self) -> FormatSpec<'a> {
let mut spec = FormatSpec {
fill: None,
@ -571,7 +571,7 @@ impl<'a> Parser<'a> {
spec
}
/// Parses a Count parameter at the current position. This does not check
/// Parses a `Count` parameter at the current position. This does not check
/// for 'CountIsNextParam' because that is only used in precision, not
/// width.
fn count(&mut self, start: usize) -> (Count, Option<InnerSpan>) {


@ -988,10 +988,12 @@ impl<'a> LoweringContext<'a> {
// lower attributes (we use the AST version) there is nowhere to keep
// the `HirId`s. We don't actually need HIR version of attributes anyway.
Attribute {
id: attr.id,
style: attr.style,
item: AttrItem {
path: attr.path.clone(),
tokens: self.lower_token_stream(attr.tokens.clone()),
},
id: attr.id,
style: attr.style,
is_sugared_doc: attr.is_sugared_doc,
span: attr.span,
}


@ -196,6 +196,11 @@ impl<'a> HashStable<StableHashingContext<'a>> for ast::Path {
}
}
impl_stable_hash_for!(struct ::syntax::ast::AttrItem {
path,
tokens,
});
impl<'a> HashStable<StableHashingContext<'a>> for ast::Attribute {
fn hash_stable(&self, hcx: &mut StableHashingContext<'a>, hasher: &mut StableHasher) {
// Make sure that these have been filtered out.
@ -203,19 +208,15 @@ impl<'a> HashStable<StableHashingContext<'a>> for ast::Attribute {
debug_assert!(!self.is_sugared_doc);
let ast::Attribute {
ref item,
id: _,
style,
ref path,
ref tokens,
is_sugared_doc: _,
span,
} = *self;
item.hash_stable(hcx, hasher);
style.hash_stable(hcx, hasher);
path.hash_stable(hcx, hasher);
for tt in tokens.trees() {
tt.hash_stable(hcx, hasher);
}
span.hash_stable(hcx, hasher);
}
}


@ -23,7 +23,7 @@ use crate::ty::relate::RelateResult;
use crate::ty::subst::{GenericArg, InternalSubsts, SubstsRef};
use crate::ty::{self, GenericParamDefKind, Ty, TyCtxt, InferConst};
use crate::ty::{FloatVid, IntVid, TyVid, ConstVid};
use crate::util::nodemap::FxHashMap;
use crate::util::nodemap::{FxHashMap, FxHashSet};
use errors::DiagnosticBuilder;
use rustc_data_structures::sync::Lrc;
@ -155,6 +155,8 @@ pub struct InferCtxt<'a, 'tcx> {
/// avoid reporting the same error twice.
pub reported_trait_errors: RefCell<FxHashMap<Span, Vec<ty::Predicate<'tcx>>>>,
pub reported_closure_mismatch: RefCell<FxHashSet<(Span, Option<Span>)>>,
/// When an error occurs, we want to avoid reporting "derived"
/// errors that are due to this original failure. Normally, we
/// handle this with the `err_count_on_creation` count, which
@ -538,6 +540,7 @@ impl<'tcx> InferCtxtBuilder<'tcx> {
selection_cache: Default::default(),
evaluation_cache: Default::default(),
reported_trait_errors: Default::default(),
reported_closure_mismatch: Default::default(),
tainted_by_errors_flag: Cell::new(false),
err_count_on_creation: tcx.sess.err_count(),
in_snapshot: Cell::new(false),


@ -24,7 +24,7 @@ use crate::hir::def_id::DefId;
use crate::infer::{self, InferCtxt};
use crate::infer::type_variable::{TypeVariableOrigin, TypeVariableOriginKind};
use crate::session::DiagnosticMessageId;
use crate::ty::{self, AdtKind, ToPredicate, ToPolyTraitRef, Ty, TyCtxt, TypeFoldable};
use crate::ty::{self, AdtKind, DefIdTree, ToPredicate, ToPolyTraitRef, Ty, TyCtxt, TypeFoldable};
use crate::ty::GenericParamDefKind;
use crate::ty::error::ExpectedFound;
use crate::ty::fast_reject;
@ -37,7 +37,7 @@ use errors::{Applicability, DiagnosticBuilder, pluralise};
use std::fmt;
use syntax::ast;
use syntax::symbol::{sym, kw};
use syntax_pos::{DUMMY_SP, Span, ExpnKind};
use syntax_pos::{DUMMY_SP, Span, ExpnKind, MultiSpan};
impl<'a, 'tcx> InferCtxt<'a, 'tcx> {
pub fn report_fulfillment_errors(
@ -550,7 +550,8 @@ impl<'a, 'tcx> InferCtxt<'a, 'tcx> {
self.suggest_new_overflow_limit(&mut err);
}
self.note_obligation_cause(&mut err, obligation);
self.note_obligation_cause_code(&mut err, &obligation.predicate, &obligation.cause.code,
&mut vec![]);
err.emit();
self.tcx.sess.abort_if_errors();
@ -885,6 +886,14 @@ impl<'a, 'tcx> InferCtxt<'a, 'tcx> {
self.tcx.hir().span_if_local(did)
).map(|sp| self.tcx.sess.source_map().def_span(sp)); // the sp could be an fn def
if self.reported_closure_mismatch.borrow().contains(&(span, found_span)) {
// We check closures twice, with obligations flowing in different directions,
// but we want to complain about them only once.
return;
}
self.reported_closure_mismatch.borrow_mut().insert((span, found_span));
let found = match found_trait_ref.skip_binder().substs.type_at(1).kind {
ty::Tuple(ref tys) => vec![ArgKind::empty(); tys.len()],
_ => vec![ArgKind::empty()],
@ -940,7 +949,9 @@ impl<'a, 'tcx> InferCtxt<'a, 'tcx> {
bug!("overflow should be handled before the `report_selection_error` path");
}
};
self.note_obligation_cause(&mut err, obligation);
err.emit();
}
@ -1604,16 +1615,166 @@ impl<'a, 'tcx> InferCtxt<'a, 'tcx> {
})
}
fn note_obligation_cause<T>(&self,
fn note_obligation_cause(
&self,
err: &mut DiagnosticBuilder<'_>,
obligation: &Obligation<'tcx, T>)
where T: fmt::Display
{
self.note_obligation_cause_code(err,
&obligation.predicate,
&obligation.cause.code,
obligation: &PredicateObligation<'tcx>,
) {
// First, attempt to add note to this error with an async-await-specific
// message, and fall back to regular note otherwise.
if !self.note_obligation_cause_for_async_await(err, obligation) {
self.note_obligation_cause_code(err, &obligation.predicate, &obligation.cause.code,
&mut vec![]);
}
}
/// Adds an async-await specific note to the diagnostic:
///
/// ```ignore (diagnostic)
/// note: future does not implement `std::marker::Send` because this value is used across an
/// await
/// --> $DIR/issue-64130-non-send-future-diags.rs:15:5
/// |
/// LL | let g = x.lock().unwrap();
/// | - has type `std::sync::MutexGuard<'_, u32>`
/// LL | baz().await;
/// | ^^^^^^^^^^^ await occurs here, with `g` maybe used later
/// LL | }
/// | - `g` is later dropped here
/// ```
///
/// Returns `true` if an async-await specific note was added to the diagnostic.
fn note_obligation_cause_for_async_await(
&self,
err: &mut DiagnosticBuilder<'_>,
obligation: &PredicateObligation<'tcx>,
) -> bool {
debug!("note_obligation_cause_for_async_await: obligation.predicate={:?} \
obligation.cause.span={:?}", obligation.predicate, obligation.cause.span);
let source_map = self.tcx.sess.source_map();
// Look into the obligation predicate to determine the type in the generator which meant
// that the predicate was not satisfied.
let (trait_ref, target_ty) = match obligation.predicate {
ty::Predicate::Trait(trait_predicate) =>
(trait_predicate.skip_binder().trait_ref, trait_predicate.skip_binder().self_ty()),
_ => return false,
};
debug!("note_obligation_cause_for_async_await: target_ty={:?}", target_ty);
// Attempt to detect an async-await error by looking at the obligation causes, looking
// for only generators, generator witnesses, opaque types or `std::future::GenFuture` to
// be present.
//
// When a future does not implement a trait because of a captured type in one of the
// generators somewhere in the call stack, then the result is a chain of obligations.
// Given a `async fn` A that calls a `async fn` B which captures a non-send type and that
// future is passed as an argument to a function C which requires a `Send` type, then the
// chain looks something like this:
//
// - `BuiltinDerivedObligation` with a generator witness (B)
// - `BuiltinDerivedObligation` with a generator (B)
// - `BuiltinDerivedObligation` with `std::future::GenFuture` (B)
// - `BuiltinDerivedObligation` with `impl std::future::Future` (B)
// - `BuiltinDerivedObligation` with `impl std::future::Future` (B)
// - `BuiltinDerivedObligation` with a generator witness (A)
// - `BuiltinDerivedObligation` with a generator (A)
// - `BuiltinDerivedObligation` with `std::future::GenFuture` (A)
// - `BuiltinDerivedObligation` with `impl std::future::Future` (A)
// - `BuiltinDerivedObligation` with `impl std::future::Future` (A)
// - `BindingObligation` with `impl_send` (`Send` requirement)
//
// The first obligations in the chain can be used to get the details of the type that is
// captured but the entire chain must be inspected to detect this case.
let mut generator = None;
let mut next_code = Some(&obligation.cause.code);
while let Some(code) = next_code {
debug!("note_obligation_cause_for_async_await: code={:?}", code);
match code {
ObligationCauseCode::BuiltinDerivedObligation(derived_obligation) |
ObligationCauseCode::ImplDerivedObligation(derived_obligation) => {
debug!("note_obligation_cause_for_async_await: self_ty.kind={:?}",
derived_obligation.parent_trait_ref.self_ty().kind);
match derived_obligation.parent_trait_ref.self_ty().kind {
ty::Adt(ty::AdtDef { did, .. }, ..) if
self.tcx.is_diagnostic_item(sym::gen_future, *did) => {},
ty::Generator(did, ..) => generator = generator.or(Some(did)),
ty::GeneratorWitness(_) | ty::Opaque(..) => {},
_ => return false,
}
next_code = Some(derived_obligation.parent_code.as_ref());
},
ObligationCauseCode::ItemObligation(_) | ObligationCauseCode::BindingObligation(..)
if generator.is_some() => break,
_ => return false,
}
}
let generator_did = generator.expect("can only reach this if there was a generator");
// Only continue to add a note if the generator is from an `async` function.
let parent_node = self.tcx.parent(generator_did)
.and_then(|parent_did| self.tcx.hir().get_if_local(parent_did));
debug!("note_obligation_cause_for_async_await: parent_node={:?}", parent_node);
if let Some(hir::Node::Item(hir::Item {
kind: hir::ItemKind::Fn(_, header, _, _),
..
})) = parent_node {
debug!("note_obligation_cause_for_async_await: header={:?}", header);
if header.asyncness != hir::IsAsync::Async {
return false;
}
}
let span = self.tcx.def_span(generator_did);
let tables = self.tcx.typeck_tables_of(generator_did);
debug!("note_obligation_cause_for_async_await: generator_did={:?} span={:?} ",
generator_did, span);
// Look for a type inside the generator interior that matches the target type to get
// a span.
let target_span = tables.generator_interior_types.iter()
.find(|ty::GeneratorInteriorTypeCause { ty, .. }| ty::TyS::same_type(*ty, target_ty))
.map(|ty::GeneratorInteriorTypeCause { span, scope_span, .. }|
(span, source_map.span_to_snippet(*span), scope_span));
if let Some((target_span, Ok(snippet), scope_span)) = target_span {
// Look at the last interior type to get a span for the `.await`.
let await_span = tables.generator_interior_types.iter().map(|i| i.span).last().unwrap();
let mut span = MultiSpan::from_span(await_span);
span.push_span_label(
await_span, format!("await occurs here, with `{}` maybe used later", snippet));
span.push_span_label(*target_span, format!("has type `{}`", target_ty));
// If available, use the scope span to annotate the drop location.
if let Some(scope_span) = scope_span {
span.push_span_label(
source_map.end_point(*scope_span),
format!("`{}` is later dropped here", snippet),
);
}
err.span_note(span, &format!(
"future does not implement `{}` as this value is used across an await",
trait_ref,
));
// Add a note for the item obligation that remains - normally a note pointing to the
// bound that introduced the obligation (e.g. `T: Send`).
debug!("note_obligation_cause_for_async_await: next_code={:?}", next_code);
self.note_obligation_cause_code(
err,
&obligation.predicate,
next_code.unwrap(),
&mut Vec::new(),
);
true
} else {
false
}
}
fn note_obligation_cause_code<T>(&self,
err: &mut DiagnosticBuilder<'_>,


@ -288,6 +288,34 @@ pub struct ResolvedOpaqueTy<'tcx> {
pub substs: SubstsRef<'tcx>,
}
/// Whenever a value may be live across a generator yield, the type of that value winds up in the
/// `GeneratorInteriorTypeCause` struct. This struct adds additional information about such
/// captured types that can be useful for diagnostics. In particular, it stores the span that
/// caused a given type to be recorded, along with the scope that enclosed the value (which can
/// be used to find the await that the value is live across).
///
/// For example:
///
/// ```ignore (pseudo-Rust)
/// async move {
/// let x: T = ...;
/// foo.await
/// ...
/// }
/// ```
///
/// Here, we would store the type `T`, the span of the value `x`, and the "scope-span" for
/// the scope that contains `x`.
#[derive(RustcEncodable, RustcDecodable, Clone, Debug, Eq, Hash, HashStable, PartialEq)]
pub struct GeneratorInteriorTypeCause<'tcx> {
/// Type of the captured binding.
pub ty: Ty<'tcx>,
/// Span of the binding that was captured.
pub span: Span,
/// Span of the scope of the captured binding.
pub scope_span: Option<Span>,
}
#[derive(RustcEncodable, RustcDecodable, Debug)]
pub struct TypeckTables<'tcx> {
/// The HirId::owner all ItemLocalIds in this table are relative to.
@ -397,6 +425,10 @@ pub struct TypeckTables<'tcx> {
/// leading to the member of the struct or tuple that is used instead of the
/// entire variable.
pub upvar_list: ty::UpvarListMap,
/// Stores the type, span and optional scope span of all types
/// that are live across the yield of this generator (if a generator).
pub generator_interior_types: Vec<GeneratorInteriorTypeCause<'tcx>>,
}
impl<'tcx> TypeckTables<'tcx> {
@ -422,6 +454,7 @@ impl<'tcx> TypeckTables<'tcx> {
free_region_map: Default::default(),
concrete_opaque_types: Default::default(),
upvar_list: Default::default(),
generator_interior_types: Default::default(),
}
}
@ -729,6 +762,7 @@ impl<'a, 'tcx> HashStable<StableHashingContext<'a>> for TypeckTables<'tcx> {
ref free_region_map,
ref concrete_opaque_types,
ref upvar_list,
ref generator_interior_types,
} = *self;
@ -773,6 +807,7 @@ impl<'a, 'tcx> HashStable<StableHashingContext<'a>> for TypeckTables<'tcx> {
free_region_map.hash_stable(hcx, hasher);
concrete_opaque_types.hash_stable(hcx, hasher);
upvar_list.hash_stable(hcx, hasher);
generator_interior_types.hash_stable(hcx, hasher);
})
}
}


@ -75,7 +75,7 @@ pub use self::binding::BindingMode;
pub use self::binding::BindingMode::*;
pub use self::context::{TyCtxt, FreeRegionInfo, AllArenas, tls, keep_local};
pub use self::context::{Lift, TypeckTables, CtxtInterners, GlobalCtxt};
pub use self::context::{Lift, GeneratorInteriorTypeCause, TypeckTables, CtxtInterners, GlobalCtxt};
pub use self::context::{
UserTypeAnnotationIndex, UserType, CanonicalUserType,
CanonicalUserTypeAnnotation, CanonicalUserTypeAnnotations, ResolvedOpaqueTy,


@ -99,8 +99,8 @@ impl Margin {
// ```
let mut m = Margin {
whitespace_left: if whitespace_left >= 6 { whitespace_left - 6 } else { 0 },
span_left: if span_left >= 6 { span_left - 6 } else { 0 },
whitespace_left: whitespace_left.saturating_sub(6),
span_left: span_left.saturating_sub(6),
span_right: span_right + 6,
computed_left: 0,
computed_right: 0,
@ -125,7 +125,7 @@ impl Margin {
} else {
self.computed_right
};
right < line_len && line_len > self.computed_left + self.column_width
right < line_len && self.computed_left + self.column_width < line_len
}
fn compute(&mut self, max_line_len: usize) {
@ -167,12 +167,10 @@ impl Margin {
}
fn right(&self, line_len: usize) -> usize {
if max(line_len, self.computed_left) - self.computed_left <= self.column_width {
line_len
} else if self.computed_right > line_len {
if line_len.saturating_sub(self.computed_left) <= self.column_width {
line_len
} else {
self.computed_right
min(line_len, self.computed_right)
}
}
}
@ -297,9 +295,11 @@ pub trait Emitter {
source_map: &Option<Lrc<SourceMapperDyn>>,
span: &mut MultiSpan,
always_backtrace: bool) -> bool {
let mut spans_updated = false;
let sm = match source_map {
Some(ref sm) => sm,
None => return false,
};
if let Some(ref sm) = source_map {
let mut before_after: Vec<(Span, Span)> = vec![];
let mut new_labels: Vec<(Span, String)> = vec![];
@ -368,10 +368,9 @@ pub trait Emitter {
}
}
// After we have them, make sure we replace these 'bad' def sites with their use sites
let spans_updated = !before_after.is_empty();
for (before, after) in before_after {
span.replace(before, after);
spans_updated = true;
}
}
spans_updated
@ -593,9 +592,9 @@ impl EmitterWriter {
let left = margin.left(source_string.len()); // Left trim
// Account for unicode characters of width !=0 that were removed.
let left = source_string.chars().take(left).fold(0, |acc, ch| {
acc + unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1)
});
let left = source_string.chars().take(left)
.map(|ch| unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1))
.sum();
self.draw_line(
buffer,
@ -623,8 +622,7 @@ impl EmitterWriter {
// 3 | |
// 4 | | }
// | |_^ test
if line.annotations.len() == 1 {
if let Some(ref ann) = line.annotations.get(0) {
if let [ann] = &line.annotations[..] {
if let AnnotationType::MultilineStart(depth) = ann.annotation_type {
if source_string.chars().take(ann.start_col).all(|c| c.is_whitespace()) {
let style = if ann.is_primary {
@ -637,7 +635,6 @@ impl EmitterWriter {
}
}
}
}
// We want to display like this:
//
@ -763,11 +760,7 @@ impl EmitterWriter {
annotations_position.push((p, annotation));
for (j, next) in annotations.iter().enumerate() {
if j > i {
let l = if let Some(ref label) = next.label {
label.len() + 2
} else {
0
};
let l = next.label.as_ref().map_or(0, |label| label.len() + 2);
if (overlaps(next, annotation, l) // Do not allow two labels to be in the same
// line if they overlap including padding, to
// avoid situations like:
@ -797,9 +790,7 @@ impl EmitterWriter {
}
}
}
if line_len < p {
line_len = p;
}
line_len = max(line_len, p);
}
if line_len != 0 {
@ -941,17 +932,9 @@ impl EmitterWriter {
Style::LabelSecondary
};
let (pos, col) = if pos == 0 {
(pos + 1, if annotation.end_col + 1 > left {
annotation.end_col + 1 - left
(pos + 1, (annotation.end_col + 1).saturating_sub(left))
} else {
0
})
} else {
(pos + 2, if annotation.start_col > left {
annotation.start_col - left
} else {
0
})
(pos + 2, annotation.start_col.saturating_sub(left))
};
if let Some(ref label) = annotation.label {
buffer.puts(line_offset + pos, code_offset + col, &label, style);
@ -966,9 +949,9 @@ impl EmitterWriter {
// | | |
// | | something about `foo`
// | something about `fn foo()`
annotations_position.sort_by(|a, b| {
// Decreasing order. When `a` and `b` are the same length, prefer `Primary`.
(a.1.len(), !a.1.is_primary).cmp(&(b.1.len(), !b.1.is_primary)).reverse()
annotations_position.sort_by_key(|(_, ann)| {
// Decreasing order. When annotations share the same length, prefer `Primary`.
(Reverse(ann.len()), ann.is_primary)
});
// Write the underlines.
@ -991,11 +974,7 @@ impl EmitterWriter {
for p in annotation.start_col..annotation.end_col {
buffer.putc(
line_offset + 1,
if code_offset + p > left {
code_offset + p - left
} else {
0
},
(code_offset + p).saturating_sub(left),
underline,
style,
);
@ -1018,40 +997,36 @@ impl EmitterWriter {
}
fn get_multispan_max_line_num(&mut self, msp: &MultiSpan) -> usize {
let sm = match self.sm {
Some(ref sm) => sm,
None => return 0,
};
let mut max = 0;
if let Some(ref sm) = self.sm {
for primary_span in msp.primary_spans() {
if !primary_span.is_dummy() {
let hi = sm.lookup_char_pos(primary_span.hi());
if hi.line > max {
max = hi.line;
}
max = (hi.line).max(max);
}
}
if !self.short_message {
for span_label in msp.span_labels() {
if !span_label.span.is_dummy() {
let hi = sm.lookup_char_pos(span_label.span.hi());
if hi.line > max {
max = hi.line;
}
}
max = (hi.line).max(max);
}
}
}
max
}
fn get_max_line_num(&mut self, span: &MultiSpan, children: &[SubDiagnostic]) -> usize {
let primary = self.get_multispan_max_line_num(span);
let mut max = primary;
for sub in children {
let sub_result = self.get_multispan_max_line_num(&sub.span);
max = std::cmp::max(sub_result, max);
}
max
children.iter()
.map(|sub| self.get_multispan_max_line_num(&sub.span))
.max()
.unwrap_or(primary)
}
/// Adds a left margin to every line but the first, given a padding length and the label being
@ -1081,15 +1056,13 @@ impl EmitterWriter {
// `max_line_num_len`
let padding = " ".repeat(padding + label.len() + 5);
/// Returns `true` if `style`, or the override if present and the style is `NoStyle`.
fn style_or_override(style: Style, override_style: Option<Style>) -> Style {
if let Some(o) = override_style {
if style == Style::NoStyle {
return o;
/// Returns `override` if it is present and `style` is `NoStyle` or `style` otherwise
fn style_or_override(style: Style, override_: Option<Style>) -> Style {
match (style, override_) {
(Style::NoStyle, Some(override_)) => override_,
_ => style,
}
}
style
}
let mut line_number = 0;
@ -1324,13 +1297,12 @@ impl EmitterWriter {
for line in &annotated_file.lines {
max_line_len = max(max_line_len, annotated_file.file
.get_line(line.line_index - 1)
.map(|s| s.len())
.unwrap_or(0));
.map_or(0, |s| s.len()));
for ann in &line.annotations {
span_right_margin = max(span_right_margin, ann.start_col);
span_right_margin = max(span_right_margin, ann.end_col);
// FIXME: account for labels not in the same line
let label_right = ann.label.as_ref().map(|l| l.len() + 1).unwrap_or(0);
let label_right = ann.label.as_ref().map_or(0, |l| l.len() + 1);
label_right_margin = max(label_right_margin, ann.end_col + label_right);
}
}
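Several hunks above swap `.map(f).unwrap_or(default)` for `Option::map_or(default, f)`. A tiny sketch showing the two spellings are interchangeable:

```rust
fn main() {
    let line: Option<&str> = Some("let x = 1;");
    let missing: Option<&str> = None;

    // `map_or` folds the map-then-default into one call.
    assert_eq!(line.map(|s| s.len()).unwrap_or(0), line.map_or(0, |s| s.len()));
    assert_eq!(missing.map_or(0, |s| s.len()), 0);
}
```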
@ -1459,7 +1431,11 @@ impl EmitterWriter {
level: &Level,
max_line_num_len: usize,
) -> io::Result<()> {
if let Some(ref sm) = self.sm {
let sm = match self.sm {
Some(ref sm) => sm,
None => return Ok(())
};
let mut buffer = StyledBuffer::new();
// Render the suggestion message
@ -1525,9 +1501,9 @@ impl EmitterWriter {
.saturating_sub(part.snippet.trim_start().len());
// ...or trailing spaces. Account for substitutions containing unicode
// characters.
let sub_len = part.snippet.trim().chars().fold(0, |acc, ch| {
acc + unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1)
});
let sub_len: usize = part.snippet.trim().chars()
.map(|ch| unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1))
.sum();
let underline_start = (span_start_pos + start) as isize + offset;
let underline_end = (span_start_pos + start + sub_len) as isize + offset;
@ -1548,9 +1524,9 @@ impl EmitterWriter {
}
// length of the code after substitution
let full_sub_len = part.snippet.chars().fold(0, |acc, ch| {
acc + unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1) as isize
});
let full_sub_len = part.snippet.chars()
.map(|ch| unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1))
.sum::<usize>() as isize;
// length of the code to be substituted
let snippet_len = span_end_pos as isize - span_start_pos as isize;
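The width calculations above move from `fold` to `map(..).sum()`, using the `unicode_width` crate to measure display columns. A sketch that assumes `unicode_width` is available as a dependency (as it is for this crate):

```rust
use unicode_width::UnicodeWidthChar;

// Display width of a snippet, counting unknown characters as width 1,
// mirroring the `map(..).sum()` form in the diff above.
fn display_width(snippet: &str) -> usize {
    snippet
        .chars()
        .map(|ch| UnicodeWidthChar::width(ch).unwrap_or(1))
        .sum()
}

fn main() {
    assert_eq!(display_width("abc"), 3);
    // Fullwidth characters report a display width of 2.
    assert_eq!(display_width("ａｂ"), 4);
}
```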
@ -1574,7 +1550,6 @@ impl EmitterWriter {
buffer.puts(row_num, 0, &msg, Style::NoStyle);
}
emit_to_destination(&buffer.render(), level, &mut self.dst, self.short_message)?;
}
Ok(())
}
@ -1732,7 +1707,7 @@ impl FileWithAnnotatedLines {
hi.col_display += 1;
}
let ann_type = if lo.line != hi.line {
if lo.line != hi.line {
let ml = MultilineAnnotation {
depth: 1,
line_start: lo.line,
@ -1740,34 +1715,27 @@ impl FileWithAnnotatedLines {
start_col: lo.col_display,
end_col: hi.col_display,
is_primary: span_label.is_primary,
label: span_label.label.clone(),
label: span_label.label,
overlaps_exactly: false,
};
multiline_annotations.push((lo.file.clone(), ml.clone()));
AnnotationType::Multiline(ml)
multiline_annotations.push((lo.file, ml));
} else {
AnnotationType::Singleline
};
let ann = Annotation {
start_col: lo.col_display,
end_col: hi.col_display,
is_primary: span_label.is_primary,
label: span_label.label.clone(),
annotation_type: ann_type,
label: span_label.label,
annotation_type: AnnotationType::Singleline,
};
if !ann.is_multiline() {
add_annotation_to_file(&mut output, lo.file, lo.line, ann);
}
};
}
}
// Find overlapping multiline annotations, put them at different depths
multiline_annotations.sort_by_key(|&(_, ref ml)| (ml.line_start, ml.line_end));
for item in multiline_annotations.clone() {
let ann = item.1;
for item in multiline_annotations.iter_mut() {
let ref mut a = item.1;
for (_, ann) in multiline_annotations.clone() {
for (_, a) in multiline_annotations.iter_mut() {
// Move all other multiline annotations overlapping with this one
// one level to the right.
if !(ann.same_span(a)) &&
@ -1784,9 +1752,7 @@ impl FileWithAnnotatedLines {
let mut max_depth = 0; // max overlapping multiline spans
for (file, ann) in multiline_annotations {
if ann.depth > max_depth {
max_depth = ann.depth;
}
max_depth = max(max_depth, ann.depth);
let mut end_ann = ann.as_end();
if !ann.overlaps_exactly {
// avoid output like

View File

@ -247,7 +247,7 @@ pub fn register_plugins<'a>(
rustc_incremental::prepare_session_directory(sess, &crate_name, disambiguator);
if sess.opts.incremental.is_some() {
time(sess, "garbage collect incremental cache directory", || {
time(sess, "garbage-collect incremental cache directory", || {
if let Err(e) = rustc_incremental::garbage_collect_session_directories(sess) {
warn!(
"Error while trying to garbage collect incremental \
@ -318,7 +318,7 @@ fn configure_and_expand_inner<'a>(
crate_loader: &'a mut CrateLoader<'a>,
plugin_info: PluginInfo,
) -> Result<(ast::Crate, Resolver<'a>)> {
time(sess, "pre ast expansion lint checks", || {
time(sess, "pre-AST-expansion lint checks", || {
lint::check_ast_crate(
sess,
&krate,
@ -536,8 +536,8 @@ pub fn lower_to_hir(
dep_graph: &DepGraph,
krate: &ast::Crate,
) -> Result<hir::map::Forest> {
// Lower ast -> hir
let hir_forest = time(sess, "lowering ast -> hir", || {
// Lower AST to HIR.
let hir_forest = time(sess, "lowering AST -> HIR", || {
let hir_crate = lower_crate(sess, cstore, &dep_graph, &krate, resolver);
if sess.opts.debugging_opts.hir_stats {
@ -757,7 +757,7 @@ pub fn prepare_outputs(
if !only_dep_info {
if let Some(ref dir) = compiler.output_dir {
if fs::create_dir_all(dir).is_err() {
sess.err("failed to find or create the directory specified by --out-dir");
sess.err("failed to find or create the directory specified by `--out-dir`");
return Err(ErrorReported);
}
}
@ -830,8 +830,8 @@ pub fn create_global_ctxt(
let global_ctxt: Option<GlobalCtxt<'_>>;
let arenas = AllArenas::new();
// Construct the HIR map
let hir_map = time(sess, "indexing hir", || {
// Construct the HIR map.
let hir_map = time(sess, "indexing HIR", || {
hir::map::map_crate(sess, cstore, &mut hir_forest, &defs)
});
@ -942,7 +942,7 @@ fn analysis(tcx: TyCtxt<'_>, cnum: CrateNum) -> Result<()> {
tcx.par_body_owners(|def_id| tcx.ensure().mir_borrowck(def_id));
});
time(sess, "dumping chalk-like clauses", || {
time(sess, "dumping Chalk-like clauses", || {
rustc_traits::lowering::dump_program_clauses(tcx);
});

View File

@ -1,4 +1,3 @@
#![feature(proc_macro_hygiene)]
#![allow(rustc::default_hash_types)]
#![recursion_limit="128"]

View File

@ -953,7 +953,7 @@ impl<'o, 'tcx> dyn AstConv<'tcx> + 'o {
span,
"default bound relaxed for a type parameter, but \
this does nothing because the given bound is not \
a default. Only `?Sized` is supported",
a default; only `?Sized` is supported",
);
}
}

View File

@ -14,7 +14,7 @@ use crate::util::nodemap::FxHashMap;
struct InteriorVisitor<'a, 'tcx> {
fcx: &'a FnCtxt<'a, 'tcx>,
types: FxHashMap<Ty<'tcx>, usize>,
types: FxHashMap<ty::GeneratorInteriorTypeCause<'tcx>, usize>,
region_scope_tree: &'tcx region::ScopeTree,
expr_count: usize,
kind: hir::GeneratorKind,
@ -83,7 +83,12 @@ impl<'a, 'tcx> InteriorVisitor<'a, 'tcx> {
} else {
// Map the type to the number of types added before it
let entries = self.types.len();
self.types.entry(&ty).or_insert(entries);
let scope_span = scope.map(|s| s.span(self.fcx.tcx, self.region_scope_tree));
self.types.entry(ty::GeneratorInteriorTypeCause {
span: source_span,
ty: &ty,
scope_span
}).or_insert(entries);
}
} else {
debug!("no type in expr = {:?}, count = {:?}, span = {:?}",
@ -118,8 +123,12 @@ pub fn resolve_interior<'a, 'tcx>(
// Sort types by insertion order
types.sort_by_key(|t| t.1);
// Store the generator types and spans into the tables for this generator.
let interior_types = types.iter().cloned().map(|t| t.0).collect::<Vec<_>>();
visitor.fcx.inh.tables.borrow_mut().generator_interior_types = interior_types;
// Extract type components
let type_list = fcx.tcx.mk_type_list(types.into_iter().map(|t| t.0));
let type_list = fcx.tcx.mk_type_list(types.into_iter().map(|t| (t.0).ty));
// The types in the generator interior contain lifetimes local to the generator itself,
// which should not be exposed outside of the generator. Therefore, we replace these

View File

@ -631,15 +631,18 @@ impl<'a, 'tcx> FnCtxt<'a, 'tcx> {
}
}
fn suggest_valid_traits(&self,
fn suggest_valid_traits(
&self,
err: &mut DiagnosticBuilder<'_>,
valid_out_of_scope_traits: Vec<DefId>) -> bool {
valid_out_of_scope_traits: Vec<DefId>,
) -> bool {
if !valid_out_of_scope_traits.is_empty() {
let mut candidates = valid_out_of_scope_traits;
candidates.sort();
candidates.dedup();
err.help("items from traits can only be used if the trait is in scope");
let msg = format!("the following {traits_are} implemented but not in scope, \
let msg = format!(
"the following {traits_are} implemented but not in scope; \
perhaps add a `use` for {one_of_them}:",
traits_are = if candidates.len() == 1 {
"trait is"
@ -650,7 +653,8 @@ impl<'a, 'tcx> FnCtxt<'a, 'tcx> {
"it"
} else {
"one of them"
});
},
);
self.suggest_use_candidates(err, msg, candidates);
true

View File

@ -2364,7 +2364,8 @@ impl<'a, 'tcx> FnCtxt<'a, 'tcx> {
// which diverges, that we are about to lint on. This gives suboptimal diagnostics.
// Instead, stop here so that the `if`- or `while`-expression's block is linted instead.
if !span.is_desugaring(DesugaringKind::CondTemporary) &&
!span.is_desugaring(DesugaringKind::Async)
!span.is_desugaring(DesugaringKind::Async) &&
!orig_span.is_desugaring(DesugaringKind::Await)
{
self.diverges.set(Diverges::WarnedAlways);

View File

@ -58,6 +58,7 @@ impl<'a, 'tcx> FnCtxt<'a, 'tcx> {
wbcx.visit_free_region_map();
wbcx.visit_user_provided_tys();
wbcx.visit_user_provided_sigs();
wbcx.visit_generator_interior_types();
let used_trait_imports = mem::replace(
&mut self.tables.borrow_mut().used_trait_imports,
@ -430,6 +431,12 @@ impl<'cx, 'tcx> WritebackCx<'cx, 'tcx> {
}
}
fn visit_generator_interior_types(&mut self) {
let fcx_tables = self.fcx.tables.borrow();
debug_assert_eq!(fcx_tables.local_id_root, self.tables.local_id_root);
self.tables.generator_interior_types = fcx_tables.generator_interior_types.clone();
}
fn visit_opaque_types(&mut self, span: Span) {
for (&def_id, opaque_defn) in self.fcx.opaque_types.borrow().iter() {
let hir_id = self.tcx().hir().as_local_hir_id(def_id).unwrap();

View File

@ -26,6 +26,7 @@ pub fn from_generator<T: Generator<Yield = ()>>(x: T) -> impl Future<Output = T:
#[doc(hidden)]
#[unstable(feature = "gen_future", issue = "50547")]
#[derive(Copy, Clone, Debug, Eq, PartialEq, Ord, PartialOrd, Hash)]
#[cfg_attr(not(test), rustc_diagnostic_item = "gen_future")]
struct GenFuture<T: Generator<Yield = ()>>(T);
// We rely on the fact that async/await futures are immovable in order to create

View File

@ -2139,18 +2139,29 @@ impl rustc_serialize::Decodable for AttrId {
}
}
#[derive(Clone, RustcEncodable, RustcDecodable, Debug)]
pub struct AttrItem {
pub path: Path,
pub tokens: TokenStream,
}
/// Metadata associated with an item.
/// Doc-comments are promoted to attributes that have `is_sugared_doc = true`.
#[derive(Clone, RustcEncodable, RustcDecodable, Debug)]
pub struct Attribute {
pub item: AttrItem,
pub id: AttrId,
pub style: AttrStyle,
pub path: Path,
pub tokens: TokenStream,
pub is_sugared_doc: bool,
pub span: Span,
}
// Compatibility impl to avoid churn, consider removing.
impl std::ops::Deref for Attribute {
type Target = AttrItem;
fn deref(&self) -> &Self::Target { &self.item }
}
/// `TraitRef`s appear in impls.
///
/// Resolution maps each `TraitRef`'s `ref_id` to its defining trait; that's all

View File

@ -9,7 +9,7 @@ pub use StabilityLevel::*;
pub use crate::ast::Attribute;
use crate::ast;
use crate::ast::{AttrId, AttrStyle, Name, Ident, Path, PathSegment};
use crate::ast::{AttrItem, AttrId, AttrStyle, Name, Ident, Path, PathSegment};
use crate::ast::{MetaItem, MetaItemKind, NestedMetaItem};
use crate::ast::{Lit, LitKind, Expr, Item, Local, Stmt, StmtKind, GenericParam};
use crate::mut_visit::visit_clobber;
@ -255,9 +255,8 @@ impl MetaItem {
}
}
impl Attribute {
/// Extracts the `MetaItem` from inside this `Attribute`.
pub fn meta(&self) -> Option<MetaItem> {
impl AttrItem {
crate fn meta(&self, span: Span) -> Option<MetaItem> {
let mut tokens = self.tokens.trees().peekable();
Some(MetaItem {
path: self.path.clone(),
@ -269,9 +268,16 @@ impl Attribute {
} else {
return None;
},
span: self.span,
span,
})
}
}
impl Attribute {
/// Extracts the MetaItem from inside this Attribute.
pub fn meta(&self) -> Option<MetaItem> {
self.item.meta(self.span)
}
pub fn parse<'a, T, F>(&self, sess: &'a ParseSess, mut f: F) -> PResult<'a, T>
where F: FnMut(&mut Parser<'a>) -> PResult<'a, T>,
@ -333,10 +339,9 @@ impl Attribute {
DUMMY_SP,
);
f(&Attribute {
item: AttrItem { path: meta.path, tokens: meta.kind.tokens(meta.span) },
id: self.id,
style: self.style,
path: meta.path,
tokens: meta.kind.tokens(meta.span),
is_sugared_doc: true,
span: self.span,
})
@ -384,10 +389,9 @@ crate fn mk_attr_id() -> AttrId {
pub fn mk_attr(style: AttrStyle, path: Path, tokens: TokenStream, span: Span) -> Attribute {
Attribute {
item: AttrItem { path, tokens },
id: mk_attr_id(),
style,
path,
tokens,
is_sugared_doc: false,
span,
}
@ -408,10 +412,12 @@ pub fn mk_sugared_doc_attr(text: Symbol, span: Span) -> Attribute {
let lit_kind = LitKind::Str(text, ast::StrStyle::Cooked);
let lit = Lit::from_lit_kind(lit_kind, span);
Attribute {
id: mk_attr_id(),
style,
item: AttrItem {
path: Path::from_ident(Ident::with_dummy_span(sym::doc).with_span_pos(span)),
tokens: MetaItemKind::NameValue(lit).tokens(span),
},
id: mk_attr_id(),
style,
is_sugared_doc: true,
span,
}
@ -524,7 +530,7 @@ impl MetaItem {
}
Some(TokenTree::Token(Token { kind: token::Interpolated(nt), .. })) => match *nt {
token::Nonterminal::NtIdent(ident, _) => Path::from_ident(ident),
token::Nonterminal::NtMeta(ref meta) => return Some(meta.clone()),
token::Nonterminal::NtMeta(ref item) => return item.meta(item.path.span),
token::Nonterminal::NtPath(ref path) => path.clone(),
_ => return None,
},

View File

@ -122,8 +122,8 @@ impl<'a> StripUnconfigured<'a> {
while !parser.check(&token::CloseDelim(token::Paren)) {
let lo = parser.token.span.lo();
let (path, tokens) = parser.parse_meta_item_unrestricted()?;
expanded_attrs.push((path, tokens, parser.prev_span.with_lo(lo)));
let item = parser.parse_attr_item()?;
expanded_attrs.push((item, parser.prev_span.with_lo(lo)));
parser.expect_one_of(&[token::Comma], &[token::CloseDelim(token::Paren)])?;
}
@ -150,11 +150,10 @@ impl<'a> StripUnconfigured<'a> {
// `cfg_attr` inside of another `cfg_attr`. E.g.
// `#[cfg_attr(false, cfg_attr(true, some_attr))]`.
expanded_attrs.into_iter()
.flat_map(|(path, tokens, span)| self.process_cfg_attr(ast::Attribute {
.flat_map(|(item, span)| self.process_cfg_attr(ast::Attribute {
item,
id: attr::mk_attr_id(),
style: attr.style,
path,
tokens,
is_sugared_doc: false,
span,
}))

View File

@ -1,4 +1,4 @@
use crate::ast::{self, Block, Ident, LitKind, NodeId, PatKind, Path};
use crate::ast::{self, AttrItem, Block, Ident, LitKind, NodeId, PatKind, Path};
use crate::ast::{MacStmtStyle, StmtKind, ItemKind};
use crate::attr::{self, HasAttrs};
use crate::source_map::respan;
@ -555,15 +555,6 @@ impl<'a, 'b> MacroExpander<'a, 'b> {
}
fn expand_invoc(&mut self, invoc: Invocation, ext: &SyntaxExtensionKind) -> AstFragment {
let (fragment_kind, span) = (invoc.fragment_kind, invoc.span());
if fragment_kind == AstFragmentKind::ForeignItems && !self.cx.ecfg.macros_in_extern() {
if let SyntaxExtensionKind::NonMacroAttr { .. } = ext {} else {
emit_feature_err(&self.cx.parse_sess, sym::macros_in_extern,
span, GateIssue::Language,
"macro invocations in `extern {}` blocks are experimental");
}
}
if self.cx.current_expansion.depth > self.cx.ecfg.recursion_limit {
let expn_data = self.cx.current_expansion.id.expn_data();
let suggested_limit = self.cx.ecfg.recursion_limit * 2;
@ -578,6 +569,7 @@ impl<'a, 'b> MacroExpander<'a, 'b> {
FatalError.raise();
}
let (fragment_kind, span) = (invoc.fragment_kind, invoc.span());
match invoc.kind {
InvocationKind::Bang { mac, .. } => match ext {
SyntaxExtensionKind::Bang(expander) => {
@ -625,9 +617,10 @@ impl<'a, 'b> MacroExpander<'a, 'b> {
| Annotatable::Variant(..)
=> panic!("unexpected annotatable"),
})), DUMMY_SP).into();
let input = self.extract_proc_macro_attr_input(attr.tokens, span);
let input = self.extract_proc_macro_attr_input(attr.item.tokens, span);
let tok_result = expander.expand(self.cx, span, input, item_tok);
let res = self.parse_ast_fragment(tok_result, fragment_kind, &attr.path, span);
let res =
self.parse_ast_fragment(tok_result, fragment_kind, &attr.item.path, span);
self.gate_proc_macro_expansion(span, &res);
res
}
@ -757,14 +750,14 @@ impl<'a, 'b> MacroExpander<'a, 'b> {
fn gate_proc_macro_expansion_kind(&self, span: Span, kind: AstFragmentKind) {
let kind = match kind {
AstFragmentKind::Expr => "expressions",
AstFragmentKind::Expr |
AstFragmentKind::OptExpr => "expressions",
AstFragmentKind::Pat => "patterns",
AstFragmentKind::Ty => "types",
AstFragmentKind::Stmts => "statements",
AstFragmentKind::Items => return,
AstFragmentKind::TraitItems => return,
AstFragmentKind::ImplItems => return,
AstFragmentKind::Ty |
AstFragmentKind::Items |
AstFragmentKind::TraitItems |
AstFragmentKind::ImplItems |
AstFragmentKind::ForeignItems => return,
AstFragmentKind::Arms
| AstFragmentKind::Fields
@ -1530,11 +1523,10 @@ impl<'a, 'b> MutVisitor for InvocationCollector<'a, 'b> {
let meta = attr::mk_list_item(Ident::with_dummy_span(sym::doc), items);
*at = attr::Attribute {
item: AttrItem { path: meta.path, tokens: meta.kind.tokens(meta.span) },
span: at.span,
id: at.id,
style: at.style,
path: meta.path,
tokens: meta.kind.tokens(meta.span),
is_sugared_doc: false,
};
} else {
@ -1578,9 +1570,6 @@ impl<'feat> ExpansionConfig<'feat> {
}
}
fn macros_in_extern(&self) -> bool {
self.features.map_or(false, |features| features.macros_in_extern)
}
fn proc_macro_hygiene(&self) -> bool {
self.features.map_or(false, |features| features.proc_macro_hygiene)
}

View File

@ -924,7 +924,7 @@ fn parse_nt(p: &mut Parser<'_>, sp: Span, name: Symbol) -> Nonterminal {
FatalError.raise()
}
sym::path => token::NtPath(panictry!(p.parse_path(PathStyle::Type))),
sym::meta => token::NtMeta(panictry!(p.parse_meta_item())),
sym::meta => token::NtMeta(panictry!(p.parse_attr_item())),
sym::vis => token::NtVis(panictry!(p.parse_visibility(true))),
sym::lifetime => if p.check_lifetime() {
token::NtLifetime(p.expect_lifetime().ident)

View File

@ -245,6 +245,8 @@ declare_features! (
(accepted, bind_by_move_pattern_guards, "1.39.0", Some(15287), None),
/// Allows attributes in formal function parameters.
(accepted, param_attrs, "1.39.0", Some(60406), None),
// Allows macro invocations in `extern {}` blocks.
(accepted, macros_in_extern, "1.40.0", Some(49476), None),
// -------------------------------------------------------------------------
// feature-group-end: accepted features
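With `macros_in_extern` moved to the accepted list, macro invocations inside `extern` blocks compile without a feature gate on sufficiently new compilers; the corresponding feature-gate `.stderr` files are deleted further down. A small sketch (declaration only, never called, so nothing needs to link):

```rust
// Expands to a foreign function declaration inside the `extern` block below.
macro_rules! returns_isize {
    ($name:ident) => {
        fn $name() -> isize;
    };
}

extern "C" {
    returns_isize!(rust_get_test_int);
}

fn main() {}
```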

View File

@ -402,9 +402,6 @@ declare_features! (
/// Allows infering `'static` outlives requirements (RFC 2093).
(active, infer_static_outlives_requirements, "1.26.0", Some(54185), None),
/// Allows macro invocations in `extern {}` blocks.
(active, macros_in_extern, "1.27.0", Some(49476), None),
/// Allows accessing fields of unions inside `const` functions.
(active, const_fn_union, "1.27.0", Some(51909), None),

View File

@ -550,7 +550,8 @@ pub fn noop_visit_local<T: MutVisitor>(local: &mut P<Local>, vis: &mut T) {
}
pub fn noop_visit_attribute<T: MutVisitor>(attr: &mut Attribute, vis: &mut T) {
let Attribute { id: _, style: _, path, tokens, is_sugared_doc: _, span } = attr;
let Attribute { item: AttrItem { path, tokens }, id: _, style: _, is_sugared_doc: _, span }
= attr;
vis.visit_path(path);
vis.visit_tts(tokens);
vis.visit_span(span);
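`noop_visit_attribute` now destructures the nested `AttrItem` directly in the `let` pattern. A standalone sketch of nested struct destructuring with hypothetical stand-in types:

```rust
struct Inner { path: String, tokens: Vec<u32> }
struct Outer { inner: Inner, id: u32, span: (u32, u32) }

fn main() {
    let attr = Outer {
        inner: Inner { path: "cfg".to_string(), tokens: vec![1, 2] },
        id: 7,
        span: (0, 3),
    };

    // One `let` reaches through the nested struct; unneeded fields are
    // ignored with `id: _`, mirroring the `id: _, style: _` pattern above.
    let Outer { inner: Inner { path, tokens }, id: _, span } = attr;

    assert_eq!(path, "cfg");
    assert_eq!(tokens, vec![1, 2]);
    assert_eq!(span, (0, 3));
}
```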
@ -681,7 +682,10 @@ pub fn noop_visit_interpolated<T: MutVisitor>(nt: &mut token::Nonterminal, vis:
token::NtIdent(ident, _is_raw) => vis.visit_ident(ident),
token::NtLifetime(ident) => vis.visit_ident(ident),
token::NtLiteral(expr) => vis.visit_expr(expr),
token::NtMeta(meta) => vis.visit_meta_item(meta),
token::NtMeta(AttrItem { path, tokens }) => {
vis.visit_path(path);
vis.visit_tts(tokens);
}
token::NtPath(path) => vis.visit_path(path),
token::NtTT(tt) => vis.visit_tt(tt),
token::NtImplItem(item) =>

View File

@ -90,7 +90,7 @@ impl<'a> Parser<'a> {
debug!("parse_attribute_with_inner_parse_policy: inner_parse_policy={:?} self.token={:?}",
inner_parse_policy,
self.token);
let (span, path, tokens, style) = match self.token.kind {
let (span, item, style) = match self.token.kind {
token::Pound => {
let lo = self.token.span;
self.bump();
@ -107,7 +107,7 @@ impl<'a> Parser<'a> {
};
self.expect(&token::OpenDelim(token::Bracket))?;
let (path, tokens) = self.parse_meta_item_unrestricted()?;
let item = self.parse_attr_item()?;
self.expect(&token::CloseDelim(token::Bracket))?;
let hi = self.prev_span;
@ -142,7 +142,7 @@ impl<'a> Parser<'a> {
}
}
(attr_sp, path, tokens, style)
(attr_sp, item, style)
}
_ => {
let token_str = self.this_token_to_string();
@ -151,10 +151,9 @@ impl<'a> Parser<'a> {
};
Ok(ast::Attribute {
item,
id: attr::mk_attr_id(),
style,
path,
tokens,
is_sugared_doc: false,
span,
})
@ -167,19 +166,19 @@ impl<'a> Parser<'a> {
/// PATH `[` TOKEN_STREAM `]`
/// PATH `{` TOKEN_STREAM `}`
/// PATH
/// PATH `=` TOKEN_TREE
/// PATH `=` UNSUFFIXED_LIT
/// The delimiters or `=` are still put into the resulting token stream.
pub fn parse_meta_item_unrestricted(&mut self) -> PResult<'a, (ast::Path, TokenStream)> {
let meta = match self.token.kind {
pub fn parse_attr_item(&mut self) -> PResult<'a, ast::AttrItem> {
let item = match self.token.kind {
token::Interpolated(ref nt) => match **nt {
Nonterminal::NtMeta(ref meta) => Some(meta.clone()),
Nonterminal::NtMeta(ref item) => Some(item.clone()),
_ => None,
},
_ => None,
};
Ok(if let Some(meta) = meta {
Ok(if let Some(item) = item {
self.bump();
(meta.path, meta.kind.tokens(meta.span))
item
} else {
let path = self.parse_path(PathStyle::Mod)?;
let tokens = if self.check(&token::OpenDelim(DelimToken::Paren)) ||
@ -206,7 +205,7 @@ impl<'a> Parser<'a> {
} else {
TokenStream::empty()
};
(path, tokens)
ast::AttrItem { path, tokens }
})
}
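For reference, the attribute shapes listed in the `parse_attr_item` doc comment correspond to ordinary attributes such as these (illustrative names on a dummy item, not part of the diff):

```rust
#[derive(Debug)]        // PATH followed by a delimited TOKEN_STREAM
#[doc = "illustrative"] // PATH `=` UNSUFFIXED_LIT
#[non_exhaustive]       // bare PATH
pub struct Example;

fn main() {
    println!("{:?}", Example);
}
```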
@ -263,7 +262,7 @@ impl<'a> Parser<'a> {
/// Matches the following grammar (per RFC 1559).
///
/// meta_item : IDENT ( '=' UNSUFFIXED_LIT | '(' meta_item_inner? ')' )? ;
/// meta_item : PATH ( '=' UNSUFFIXED_LIT | '(' meta_item_inner? ')' )? ;
/// meta_item_inner : (meta_item | UNSUFFIXED_LIT) (',' meta_item_inner)? ;
pub fn parse_meta_item(&mut self) -> PResult<'a, ast::MetaItem> {
let nt_meta = match self.token.kind {
@ -274,9 +273,14 @@ impl<'a> Parser<'a> {
_ => None,
};
if let Some(meta) = nt_meta {
if let Some(item) = nt_meta {
return match item.meta(item.path.span) {
Some(meta) => {
self.bump();
return Ok(meta);
Ok(meta)
}
None => self.unexpected(),
}
}
let lo = self.token.span;

View File

@ -47,7 +47,7 @@ impl<'a> StringReader<'a> {
source_file: Lrc<syntax_pos::SourceFile>,
override_span: Option<Span>) -> Self {
if source_file.src.is_none() {
sess.span_diagnostic.bug(&format!("Cannot lex source_file without source: {}",
sess.span_diagnostic.bug(&format!("cannot lex `source_file` without source: {}",
source_file.name));
}

View File

@ -18,6 +18,8 @@ type Expected = Option<&'static str>;
/// `Expected` for function and lambda parameter patterns.
pub(super) const PARAM_EXPECTED: Expected = Some("parameter name");
const WHILE_PARSING_OR_MSG: &str = "while parsing this or-pattern starting here";
/// Whether or not an or-pattern should be gated when occurring in the current context.
#[derive(PartialEq)]
pub enum GateOr { Yes, No }
@ -40,7 +42,7 @@ impl<'a> Parser<'a> {
/// Corresponds to `top_pat` in RFC 2535 and allows or-pattern at the top level.
pub(super) fn parse_top_pat(&mut self, gate_or: GateOr) -> PResult<'a, P<Pat>> {
// Allow a '|' before the pats (RFCs 1925, 2530, and 2535).
let gated_leading_vert = self.eat_or_separator() && gate_or == GateOr::Yes;
let gated_leading_vert = self.eat_or_separator(None) && gate_or == GateOr::Yes;
let leading_vert_span = self.prev_span;
// Parse the possibly-or-pattern.
@ -63,7 +65,7 @@ impl<'a> Parser<'a> {
/// Parse the pattern for a function or function pointer parameter.
/// Special recovery is provided for or-patterns and leading `|`.
pub(super) fn parse_fn_param_pat(&mut self) -> PResult<'a, P<Pat>> {
self.recover_leading_vert("not allowed in a parameter pattern");
self.recover_leading_vert(None, "not allowed in a parameter pattern");
let pat = self.parse_pat_with_or(PARAM_EXPECTED, GateOr::No, RecoverComma::No)?;
if let PatKind::Or(..) = &pat.kind {
@ -90,7 +92,7 @@ impl<'a> Parser<'a> {
gate_or: GateOr,
rc: RecoverComma,
) -> PResult<'a, P<Pat>> {
// Parse the first pattern.
// Parse the first pattern (`p_0`).
let first_pat = self.parse_pat(expected)?;
self.maybe_recover_unexpected_comma(first_pat.span, rc)?;
@ -100,11 +102,12 @@ impl<'a> Parser<'a> {
return Ok(first_pat)
}
// Parse the patterns `p_1 | ... | p_n` where `n > 0`.
let lo = first_pat.span;
let mut pats = vec![first_pat];
while self.eat_or_separator() {
while self.eat_or_separator(Some(lo)) {
let pat = self.parse_pat(expected).map_err(|mut err| {
err.span_label(lo, "while parsing this or-pattern starting here");
err.span_label(lo, WHILE_PARSING_OR_MSG);
err
})?;
self.maybe_recover_unexpected_comma(pat.span, rc)?;
@ -122,11 +125,15 @@ impl<'a> Parser<'a> {
/// Eat the or-pattern `|` separator.
/// If instead a `||` token is encountered, recover and pretend we parsed `|`.
fn eat_or_separator(&mut self) -> bool {
fn eat_or_separator(&mut self, lo: Option<Span>) -> bool {
if self.recover_trailing_vert(lo) {
return false;
}
match self.token.kind {
token::OrOr => {
// Found `||`; Recover and pretend we parsed `|`.
self.ban_unexpected_or_or();
self.ban_unexpected_or_or(lo);
self.bump();
true
}
@ -134,16 +141,49 @@ impl<'a> Parser<'a> {
}
}
/// Recover if `|` or `||` is the current token and we have one of the
/// tokens `=>`, `if`, `=`, `:`, `;`, `,`, `]`, `)`, or `}` ahead of us.
///
/// These tokens all indicate that we reached the end of the or-pattern
/// list and can now reliably say that the `|` was an illegal trailing vert.
/// Note that there are more tokens such as `@` for which we know that the `|`
/// is an illegal parse. However, the user's intent is less clear in that case.
fn recover_trailing_vert(&mut self, lo: Option<Span>) -> bool {
let is_end_ahead = self.look_ahead(1, |token| match &token.kind {
token::FatArrow // e.g. `a | => 0,`.
| token::Ident(kw::If, false) // e.g. `a | if expr`.
| token::Eq // e.g. `let a | = 0`.
| token::Semi // e.g. `let a |;`.
| token::Colon // e.g. `let a | :`.
| token::Comma // e.g. `let (a |,)`.
| token::CloseDelim(token::Bracket) // e.g. `let [a | ]`.
| token::CloseDelim(token::Paren) // e.g. `let (a | )`.
| token::CloseDelim(token::Brace) => true, // e.g. `let A { f: a | }`.
_ => false,
});
match (is_end_ahead, &self.token.kind) {
(true, token::BinOp(token::Or)) | (true, token::OrOr) => {
self.ban_illegal_vert(lo, "trailing", "not allowed in an or-pattern");
self.bump();
true
}
_ => false,
}
}
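The recovery above peeks at the next token to decide whether a `|` can only be trailing. A toy sketch of that look-ahead idea using a made-up token enum rather than rustc's `Parser`/`TokenKind` machinery:

```rust
enum Tok { Or, OrOr, FatArrow, Comma, CloseParen, Ident }

/// True when the current token is `|`/`||` and the next token can only mean
/// the or-pattern list has ended, so the vert must be trailing.
fn is_trailing_vert(current: Tok, next: Option<Tok>) -> bool {
    let ends_list = matches!(
        next,
        Some(Tok::FatArrow) | Some(Tok::Comma) | Some(Tok::CloseParen)
    );
    matches!(current, Tok::Or | Tok::OrOr) && ends_list
}

fn main() {
    assert!(is_trailing_vert(Tok::Or, Some(Tok::FatArrow)));     // `a | => 0,`
    assert!(is_trailing_vert(Tok::OrOr, Some(Tok::CloseParen))); // `(a || )`
    assert!(!is_trailing_vert(Tok::Or, Some(Tok::Ident)));       // `a | b`
    assert!(!is_trailing_vert(Tok::Ident, Some(Tok::Comma)));    // no vert at all
}
```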
/// We have parsed `||` instead of `|`. Error and suggest `|` instead.
fn ban_unexpected_or_or(&mut self) {
self.struct_span_err(self.token.span, "unexpected token `||` after pattern")
.span_suggestion(
fn ban_unexpected_or_or(&mut self, lo: Option<Span>) {
let mut err = self.struct_span_err(self.token.span, "unexpected token `||` after pattern");
err.span_suggestion(
self.token.span,
"use a single `|` to separate multiple alternative patterns",
"|".to_owned(),
Applicability::MachineApplicable
)
.emit();
);
if let Some(lo) = lo {
err.span_label(lo, WHILE_PARSING_OR_MSG);
}
err.emit();
}
/// Some special error handling for the "top-level" patterns in a match arm,
@ -198,25 +238,38 @@ impl<'a> Parser<'a> {
/// Recursive possibly-or-pattern parser with recovery for an erroneous leading `|`.
/// See `parse_pat_with_or` for details on parsing or-patterns.
fn parse_pat_with_or_inner(&mut self) -> PResult<'a, P<Pat>> {
self.recover_leading_vert("only allowed in a top-level pattern");
self.recover_leading_vert(None, "only allowed in a top-level pattern");
self.parse_pat_with_or(None, GateOr::Yes, RecoverComma::No)
}
/// Recover if `|` or `||` is here.
/// The user is thinking that a leading `|` is allowed in this position.
fn recover_leading_vert(&mut self, ctx: &str) {
fn recover_leading_vert(&mut self, lo: Option<Span>, ctx: &str) {
if let token::BinOp(token::Or) | token::OrOr = self.token.kind {
let span = self.token.span;
let rm_msg = format!("remove the `{}`", pprust::token_to_string(&self.token));
self.struct_span_err(span, &format!("a leading `|` is {}", ctx))
.span_suggestion(span, &rm_msg, String::new(), Applicability::MachineApplicable)
.emit();
self.ban_illegal_vert(lo, "leading", ctx);
self.bump();
}
}
/// A `|` or possibly `||` token shouldn't be here. Ban it.
fn ban_illegal_vert(&mut self, lo: Option<Span>, pos: &str, ctx: &str) {
let span = self.token.span;
let mut err = self.struct_span_err(span, &format!("a {} `|` is {}", pos, ctx));
err.span_suggestion(
span,
&format!("remove the `{}`", pprust::token_to_string(&self.token)),
String::new(),
Applicability::MachineApplicable,
);
if let Some(lo) = lo {
err.span_label(lo, WHILE_PARSING_OR_MSG);
}
if let token::OrOr = self.token.kind {
err.note("alternatives in or-patterns are separated with `|`, not `||`");
}
err.emit();
}
/// Parses a pattern, with a setting whether modern range patterns (e.g., `a..=b`, `a..b` are
/// allowed).
fn parse_pat_with_range_pat(
@ -259,7 +312,7 @@ impl<'a> Parser<'a> {
self.bump();
self.parse_pat_range_to(RangeEnd::Included(RangeSyntax::DotDotDot), "...")?
}
// At this point, token != &, &&, (, [
// At this point, token != `&`, `&&`, `(`, `[`, `..`, `..=`, or `...`.
_ => if self.eat_keyword(kw::Underscore) {
// Parse _
PatKind::Wild

View File

@ -114,9 +114,9 @@ impl<'a> Parser<'a> {
pub fn parse_path_allowing_meta(&mut self, style: PathStyle) -> PResult<'a, Path> {
let meta_ident = match self.token.kind {
token::Interpolated(ref nt) => match **nt {
token::NtMeta(ref meta) => match meta.kind {
ast::MetaItemKind::Word => Some(meta.path.clone()),
_ => None,
token::NtMeta(ref item) => match item.tokens.is_empty() {
true => Some(item.path.clone()),
false => None,
},
_ => None,
},

View File

@ -687,7 +687,7 @@ pub enum Nonterminal {
NtLifetime(ast::Ident),
NtLiteral(P<ast::Expr>),
/// Stuff inside brackets for attributes
NtMeta(ast::MetaItem),
NtMeta(ast::AttrItem),
NtPath(ast::Path),
NtVis(ast::Visibility),
NtTT(TokenTree),

View File

@ -324,7 +324,7 @@ fn token_to_string_ext(token: &Token, convert_dollar_crate: bool) -> String {
crate fn nonterminal_to_string(nt: &Nonterminal) -> String {
match *nt {
token::NtExpr(ref e) => expr_to_string(e),
token::NtMeta(ref e) => meta_item_to_string(e),
token::NtMeta(ref e) => attr_item_to_string(e),
token::NtTy(ref e) => ty_to_string(e),
token::NtPath(ref e) => path_to_string(e),
token::NtItem(ref e) => item_to_string(e),
@ -412,8 +412,8 @@ pub fn meta_list_item_to_string(li: &ast::NestedMetaItem) -> String {
to_string(|s| s.print_meta_list_item(li))
}
pub fn meta_item_to_string(mi: &ast::MetaItem) -> String {
to_string(|s| s.print_meta_item(mi))
fn attr_item_to_string(ai: &ast::AttrItem) -> String {
to_string(|s| s.print_attr_item(ai, ai.path.span))
}
pub fn attribute_to_string(attr: &ast::Attribute) -> String {
@ -629,24 +629,28 @@ pub trait PrintState<'a>: std::ops::Deref<Target = pp::Printer> + std::ops::Dere
ast::AttrStyle::Inner => self.word("#!["),
ast::AttrStyle::Outer => self.word("#["),
}
self.print_attr_item(&attr.item, attr.span);
self.word("]");
}
}
fn print_attr_item(&mut self, item: &ast::AttrItem, span: Span) {
self.ibox(0);
match attr.tokens.trees().next() {
match item.tokens.trees().next() {
Some(TokenTree::Delimited(_, delim, tts)) => {
self.print_mac_common(
Some(MacHeader::Path(&attr.path)), false, None, delim, tts, true, attr.span
Some(MacHeader::Path(&item.path)), false, None, delim, tts, true, span
);
}
tree => {
self.print_path(&attr.path, false, 0);
self.print_path(&item.path, false, 0);
if tree.is_some() {
self.space();
self.print_tts(attr.tokens.clone(), true);
self.print_tts(item.tokens.clone(), true);
}
}
}
self.end();
self.word("]");
}
}
fn print_meta_list_item(&mut self, item: &ast::NestedMetaItem) {

View File

@ -1,6 +1,6 @@
//! Attributes injected into the crate root from command line using `-Z crate-attr`.
use syntax::ast::{self, AttrStyle};
use syntax::ast::{self, AttrItem, AttrStyle};
use syntax::attr::mk_attr;
use syntax::panictry;
use syntax::parse::{self, token, ParseSess};
@ -15,7 +15,7 @@ pub fn inject(mut krate: ast::Crate, parse_sess: &ParseSess, attrs: &[String]) -
);
let start_span = parser.token.span;
let (path, tokens) = panictry!(parser.parse_meta_item_unrestricted());
let AttrItem { path, tokens } = panictry!(parser.parse_attr_item());
let end_span = parser.token.span;
if parser.token != token::Eof {
parse_sess.span_diagnostic

View File

@ -884,7 +884,7 @@ pub struct OffsetOverflowError;
/// A single source in the `SourceMap`.
#[derive(Clone)]
pub struct SourceFile {
/// The name of the file that the source came from, source that doesn't
/// The name of the file that the source came from. Source that doesn't
/// originate from files has names between angle brackets by convention
/// (e.g., `<anon>`).
pub name: FileName,
@ -922,9 +922,9 @@ impl Encodable for SourceFile {
s.emit_struct_field("name", 0, |s| self.name.encode(s))?;
s.emit_struct_field("name_was_remapped", 1, |s| self.name_was_remapped.encode(s))?;
s.emit_struct_field("src_hash", 2, |s| self.src_hash.encode(s))?;
s.emit_struct_field("start_pos", 4, |s| self.start_pos.encode(s))?;
s.emit_struct_field("end_pos", 5, |s| self.end_pos.encode(s))?;
s.emit_struct_field("lines", 6, |s| {
s.emit_struct_field("start_pos", 3, |s| self.start_pos.encode(s))?;
s.emit_struct_field("end_pos", 4, |s| self.end_pos.encode(s))?;
s.emit_struct_field("lines", 5, |s| {
let lines = &self.lines[..];
// Store the length.
s.emit_u32(lines.len() as u32)?;
@ -970,13 +970,13 @@ impl Encodable for SourceFile {
Ok(())
})?;
s.emit_struct_field("multibyte_chars", 7, |s| {
s.emit_struct_field("multibyte_chars", 6, |s| {
self.multibyte_chars.encode(s)
})?;
s.emit_struct_field("non_narrow_chars", 8, |s| {
s.emit_struct_field("non_narrow_chars", 7, |s| {
self.non_narrow_chars.encode(s)
})?;
s.emit_struct_field("name_hash", 9, |s| {
s.emit_struct_field("name_hash", 8, |s| {
self.name_hash.encode(s)
})
})
@ -985,7 +985,6 @@ impl Encodable for SourceFile {
impl Decodable for SourceFile {
fn decode<D: Decoder>(d: &mut D) -> Result<SourceFile, D::Error> {
d.read_struct("SourceFile", 8, |d| {
let name: FileName = d.read_struct_field("name", 0, |d| Decodable::decode(d))?;
let name_was_remapped: bool =
@ -993,9 +992,9 @@ impl Decodable for SourceFile {
let src_hash: u128 =
d.read_struct_field("src_hash", 2, |d| Decodable::decode(d))?;
let start_pos: BytePos =
d.read_struct_field("start_pos", 4, |d| Decodable::decode(d))?;
let end_pos: BytePos = d.read_struct_field("end_pos", 5, |d| Decodable::decode(d))?;
let lines: Vec<BytePos> = d.read_struct_field("lines", 6, |d| {
d.read_struct_field("start_pos", 3, |d| Decodable::decode(d))?;
let end_pos: BytePos = d.read_struct_field("end_pos", 4, |d| Decodable::decode(d))?;
let lines: Vec<BytePos> = d.read_struct_field("lines", 5, |d| {
let num_lines: u32 = Decodable::decode(d)?;
let mut lines = Vec::with_capacity(num_lines as usize);
@ -1024,18 +1023,18 @@ impl Decodable for SourceFile {
Ok(lines)
})?;
let multibyte_chars: Vec<MultiByteChar> =
d.read_struct_field("multibyte_chars", 7, |d| Decodable::decode(d))?;
d.read_struct_field("multibyte_chars", 6, |d| Decodable::decode(d))?;
let non_narrow_chars: Vec<NonNarrowChar> =
d.read_struct_field("non_narrow_chars", 8, |d| Decodable::decode(d))?;
d.read_struct_field("non_narrow_chars", 7, |d| Decodable::decode(d))?;
let name_hash: u128 =
d.read_struct_field("name_hash", 9, |d| Decodable::decode(d))?;
d.read_struct_field("name_hash", 8, |d| Decodable::decode(d))?;
Ok(SourceFile {
name,
name_was_remapped,
unmapped_path: None,
// `crate_of_origin` has to be set by the importer.
// This value matches up with rustc::hir::def_id::INVALID_CRATE.
// That constant is not available here unfortunately :(
// This value matches up with `rustc::hir::def_id::INVALID_CRATE`.
// That constant is not available here, unfortunately.
crate_of_origin: std::u32::MAX - 1,
start_pos,
end_pos,

View File

@ -1,30 +0,0 @@
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/macros-in-extern.rs:26:5
|
LL | returns_isize!(rust_get_test_int);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/macros-in-extern.rs:28:5
|
LL | takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/macros-in-extern.rs:30:5
|
LL | emits_nothing!();
| ^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error: aborting due to 3 previous errors
For more information about this error, try `rustc --explain E0658`.

View File

@ -1,112 +0,0 @@
// force-host
// no-prefer-dynamic
// Proc macros commonly used by tests.
// `panic`/`print` -> `panic_bang`/`print_bang` to avoid conflicts with standard macros.
#![crate_type = "proc-macro"]
extern crate proc_macro;
use proc_macro::TokenStream;
// Macro that return empty token stream.
// Macro that returns an empty token stream.
#[proc_macro]
pub fn empty(_: TokenStream) -> TokenStream {
TokenStream::new()
}
#[proc_macro_attribute]
pub fn empty_attr(_: TokenStream, _: TokenStream) -> TokenStream {
TokenStream::new()
}
#[proc_macro_derive(Empty, attributes(empty_helper))]
pub fn empty_derive(_: TokenStream) -> TokenStream {
TokenStream::new()
}
// Macro that panics.
#[proc_macro]
pub fn panic_bang(_: TokenStream) -> TokenStream {
panic!("panic-bang");
}
#[proc_macro_attribute]
pub fn panic_attr(_: TokenStream, _: TokenStream) -> TokenStream {
panic!("panic-attr");
}
#[proc_macro_derive(Panic, attributes(panic_helper))]
pub fn panic_derive(_: TokenStream) -> TokenStream {
panic!("panic-derive");
}
// Macros that return the input stream.
#[proc_macro]
pub fn identity(input: TokenStream) -> TokenStream {
input
}
#[proc_macro_attribute]
pub fn identity_attr(_: TokenStream, input: TokenStream) -> TokenStream {
input
}
#[proc_macro_derive(Identity, attributes(identity_helper))]
pub fn identity_derive(input: TokenStream) -> TokenStream {
input
}
// Macros that iterate and re-collect the input stream.
#[proc_macro]
pub fn recollect(input: TokenStream) -> TokenStream {
input.into_iter().collect()
}
#[proc_macro_attribute]
pub fn recollect_attr(_: TokenStream, input: TokenStream) -> TokenStream {
input.into_iter().collect()
}
#[proc_macro_derive(Recollect, attributes(recollect_helper))]
pub fn recollect_derive(input: TokenStream) -> TokenStream {
input.into_iter().collect()
}
// Macros that print their input in the original and re-collected forms (if they differ).
fn print_helper(input: TokenStream, kind: &str) -> TokenStream {
let input_display = format!("{}", input);
let input_debug = format!("{:#?}", input);
let recollected = input.into_iter().collect();
let recollected_display = format!("{}", recollected);
let recollected_debug = format!("{:#?}", recollected);
println!("PRINT-{} INPUT (DISPLAY): {}", kind, input_display);
if recollected_display != input_display {
println!("PRINT-{} RE-COLLECTED (DISPLAY): {}", kind, recollected_display);
}
println!("PRINT-{} INPUT (DEBUG): {}", kind, input_debug);
if recollected_debug != input_debug {
println!("PRINT-{} RE-COLLECTED (DEBUG): {}", kind, recollected_debug);
}
recollected
}
#[proc_macro]
pub fn print_bang(input: TokenStream) -> TokenStream {
print_helper(input, "BANG")
}
#[proc_macro_attribute]
pub fn print_attr(_: TokenStream, input: TokenStream) -> TokenStream {
print_helper(input, "ATTR")
}
#[proc_macro_derive(Print, attributes(print_helper))]
pub fn print_derive(input: TokenStream) -> TokenStream {
print_helper(input, "DERIVE")
}

View File

@ -1,30 +0,0 @@
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/macros-in-extern.rs:14:5
|
LL | #[empty_attr]
| ^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/macros-in-extern.rs:18:5
|
LL | #[identity_attr]
| ^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/macros-in-extern.rs:22:5
|
LL | identity!(fn rust_dbg_extern_identity_u32(arg: u32) -> u32;);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error: aborting due to 3 previous errors
For more information about this error, try `rustc --explain E0658`.

View File

@ -1,26 +1,15 @@
fn main() {
f1(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
f2(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
f3(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
f4(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
f5(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
g1(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
g2(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
g3(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
g4(|_: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
h1(|_: (), _: (), _: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
h2(|_: (), _: (), _: (), _: ()| {}); //~ ERROR type mismatch
//~^ ERROR type mismatch
}
// Basic

View File

@ -10,18 +10,7 @@ LL | fn f1<F>(_: F) where F: Fn(&(), &()) {}
| -- ------------ required by this bound in `f1`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:2:5
|
LL | f1(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&(), &()) -> _`
...
LL | fn f1<F>(_: F) where F: Fn(&(), &()) {}
| -- ------------ required by this bound in `f1`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:4:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:3:5
|
LL | f2(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
@ -34,17 +23,6 @@ LL | fn f2<F>(_: F) where F: for<'a> Fn(&'a (), &()) {}
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:4:5
|
LL | f2(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&'a (), &()) -> _`
...
LL | fn f2<F>(_: F) where F: for<'a> Fn(&'a (), &()) {}
| -- --------------- required by this bound in `f2`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:6:5
|
LL | f3(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
@ -54,18 +32,7 @@ LL | fn f3<'a, F>(_: F) where F: Fn(&'a (), &()) {}
| -- --------------- required by this bound in `f3`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:6:5
|
LL | f3(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&(), &()) -> _`
...
LL | fn f3<'a, F>(_: F) where F: Fn(&'a (), &()) {}
| -- --------------- required by this bound in `f3`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:8:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:5:5
|
LL | f4(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
@ -76,18 +43,7 @@ LL | fn f4<F>(_: F) where F: for<'r> Fn(&(), &'r ()) {}
| -- ----------------------- required by this bound in `f4`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:8:5
|
LL | f4(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&(), &'r ()) -> _`
...
LL | fn f4<F>(_: F) where F: for<'r> Fn(&(), &'r ()) {}
| -- --------------- required by this bound in `f4`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:10:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:6:5
|
LL | f5(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
@ -98,18 +54,7 @@ LL | fn f5<F>(_: F) where F: for<'r> Fn(&'r (), &'r ()) {}
| -- -------------------------- required by this bound in `f5`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:10:5
|
LL | f5(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&'r (), &'r ()) -> _`
...
LL | fn f5<F>(_: F) where F: for<'r> Fn(&'r (), &'r ()) {}
| -- ------------------ required by this bound in `f5`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:12:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:7:5
|
LL | g1(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
@ -120,18 +65,7 @@ LL | fn g1<F>(_: F) where F: Fn(&(), Box<dyn Fn(&())>) {}
| -- ------------------------- required by this bound in `g1`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:12:5
|
LL | g1(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&(), std::boxed::Box<(dyn for<'r> std::ops::Fn(&'r ()) + 'static)>) -> _`
...
LL | fn g1<F>(_: F) where F: Fn(&(), Box<dyn Fn(&())>) {}
| -- ------------------------- required by this bound in `g1`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:14:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:8:5
|
LL | g2(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
@ -142,18 +76,7 @@ LL | fn g2<F>(_: F) where F: Fn(&(), fn(&())) {}
| -- ---------------- required by this bound in `g2`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:14:5
|
LL | g2(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&(), for<'r> fn(&'r ())) -> _`
...
LL | fn g2<F>(_: F) where F: Fn(&(), fn(&())) {}
| -- ---------------- required by this bound in `g2`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:16:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:9:5
|
LL | g3(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
@ -164,18 +87,7 @@ LL | fn g3<F>(_: F) where F: for<'s> Fn(&'s (), Box<dyn Fn(&())>) {}
| -- ------------------------------------ required by this bound in `g3`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:16:5
|
LL | g3(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&'s (), std::boxed::Box<(dyn for<'r> std::ops::Fn(&'r ()) + 'static)>) -> _`
...
LL | fn g3<F>(_: F) where F: for<'s> Fn(&'s (), Box<dyn Fn(&())>) {}
| -- ---------------------------- required by this bound in `g3`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:18:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:10:5
|
LL | g4(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
@ -186,18 +98,7 @@ LL | fn g4<F>(_: F) where F: Fn(&(), for<'r> fn(&'r ())) {}
| -- --------------------------- required by this bound in `g4`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:18:5
|
LL | g4(|_: (), _: ()| {});
| ^^ -------------- found signature of `fn((), ()) -> _`
| |
| expected signature of `fn(&(), for<'r> fn(&'r ())) -> _`
...
LL | fn g4<F>(_: F) where F: Fn(&(), for<'r> fn(&'r ())) {}
| -- --------------------------- required by this bound in `g4`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:20:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:11:5
|
LL | h1(|_: (), _: (), _: (), _: ()| {});
| ^^ ---------------------------- found signature of `fn((), (), (), ()) -> _`
@ -208,18 +109,7 @@ LL | fn h1<F>(_: F) where F: Fn(&(), Box<dyn Fn(&())>, &(), fn(&(), &())) {}
| -- -------------------------------------------- required by this bound in `h1`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:20:5
|
LL | h1(|_: (), _: (), _: (), _: ()| {});
| ^^ ---------------------------- found signature of `fn((), (), (), ()) -> _`
| |
| expected signature of `fn(&(), std::boxed::Box<(dyn for<'r> std::ops::Fn(&'r ()) + 'static)>, &(), for<'r, 's> fn(&'r (), &'s ())) -> _`
...
LL | fn h1<F>(_: F) where F: Fn(&(), Box<dyn Fn(&())>, &(), fn(&(), &())) {}
| -- -------------------------------------------- required by this bound in `h1`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:22:5
--> $DIR/anonymous-higher-ranked-lifetime.rs:12:5
|
LL | h2(|_: (), _: (), _: (), _: ()| {});
| ^^ ---------------------------- found signature of `fn((), (), (), ()) -> _`
@ -229,16 +119,5 @@ LL | h2(|_: (), _: (), _: (), _: ()| {});
LL | fn h2<F>(_: F) where F: for<'t0> Fn(&(), Box<dyn Fn(&())>, &'t0 (), fn(&(), &())) {}
| -- --------------------------------------------------------- required by this bound in `h2`
error[E0631]: type mismatch in closure arguments
--> $DIR/anonymous-higher-ranked-lifetime.rs:22:5
|
LL | h2(|_: (), _: (), _: (), _: ()| {});
| ^^ ---------------------------- found signature of `fn((), (), (), ()) -> _`
| |
| expected signature of `fn(&(), std::boxed::Box<(dyn for<'r> std::ops::Fn(&'r ()) + 'static)>, &'t0 (), for<'r, 's> fn(&'r (), &'s ())) -> _`
...
LL | fn h2<F>(_: F) where F: for<'t0> Fn(&(), Box<dyn Fn(&())>, &'t0 (), fn(&(), &())) {}
| -- ------------------------------------------------ required by this bound in `h2`
error: aborting due to 22 previous errors
error: aborting due to 11 previous errors

View File

@ -9,9 +9,9 @@ LL | assert_send(local_dropped_before_await());
|
= help: within `impl std::future::Future`, the trait `std::marker::Send` is not implemented for `std::rc::Rc<()>`
= note: required because it appears within the type `impl std::fmt::Debug`
= note: required because it appears within the type `{impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:21:39: 26:2 {impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:21:39: 26:2 {impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]>`
= note: required because it appears within the type `{impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:21:39: 26:2 {impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:21:39: 26:2 {impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]>`
= note: required because it appears within the type `impl std::future::Future`
= note: required because it appears within the type `impl std::future::Future`
@ -26,9 +26,9 @@ LL | assert_send(non_send_temporary_in_match());
|
= help: within `impl std::future::Future`, the trait `std::marker::Send` is not implemented for `std::rc::Rc<()>`
= note: required because it appears within the type `impl std::fmt::Debug`
= note: required because it appears within the type `{fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:28:40: 37:2 {fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:28:40: 37:2 {fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]>`
= note: required because it appears within the type `{fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:28:40: 37:2 {fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:28:40: 37:2 {fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]>`
= note: required because it appears within the type `impl std::future::Future`
= note: required because it appears within the type `impl std::future::Future`
@ -45,9 +45,9 @@ LL | assert_send(non_sync_with_method_call());
= note: required because of the requirements on the impl of `std::marker::Send` for `&mut dyn std::fmt::Write`
= note: required because it appears within the type `std::fmt::Formatter<'_>`
= note: required because of the requirements on the impl of `std::marker::Send` for `&mut std::fmt::Formatter<'_>`
= note: required because it appears within the type `for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]>`
= note: required because it appears within the type `for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]>`
= note: required because it appears within the type `impl std::future::Future`
= note: required because it appears within the type `impl std::future::Future`
@ -68,9 +68,9 @@ LL | assert_send(non_sync_with_method_call());
= note: required because of the requirements on the impl of `std::marker::Send` for `std::slice::Iter<'_, std::fmt::ArgumentV1<'_>>`
= note: required because it appears within the type `std::fmt::Formatter<'_>`
= note: required because of the requirements on the impl of `std::marker::Send` for `&mut std::fmt::Formatter<'_>`
= note: required because it appears within the type `for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]>`
= note: required because it appears within the type `for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]>`
= note: required because it appears within the type `impl std::future::Future`
= note: required because it appears within the type `impl std::future::Future`


@ -0,0 +1,25 @@
// edition:2018
use std::sync::Mutex;
fn is_send<T: Send>(t: T) {
}
async fn foo() {
bar(&Mutex::new(22)).await;
}
async fn bar(x: &Mutex<u32>) {
let g = x.lock().unwrap();
baz().await;
}
async fn baz() {
}
fn main() {
is_send(foo());
//~^ ERROR `std::sync::MutexGuard<'_, u32>` cannot be sent between threads safely [E0277]
}


@ -0,0 +1,23 @@
error[E0277]: `std::sync::MutexGuard<'_, u32>` cannot be sent between threads safely
--> $DIR/issue-64130-non-send-future-diags.rs:23:5
|
LL | fn is_send<T: Send>(t: T) {
| ------- ---- required by this bound in `is_send`
...
LL | is_send(foo());
| ^^^^^^^ `std::sync::MutexGuard<'_, u32>` cannot be sent between threads safely
|
= help: within `impl std::future::Future`, the trait `std::marker::Send` is not implemented for `std::sync::MutexGuard<'_, u32>`
note: future does not implement `std::marker::Send` as this value is used across an await
--> $DIR/issue-64130-non-send-future-diags.rs:15:5
|
LL | let g = x.lock().unwrap();
| - has type `std::sync::MutexGuard<'_, u32>`
LL | baz().await;
| ^^^^^^^^^^^ await occurs here, with `g` maybe used later
LL | }
| - `g` is later dropped here
error: aborting due to previous error
For more information about this error, try `rustc --explain E0277`.


@ -0,0 +1,12 @@
// edition:2018
#![deny(unreachable_code)]
async fn foo() {
return; bar().await;
//~^ ERROR unreachable statement
}
async fn bar() {
}
fn main() { }


@ -0,0 +1,16 @@
error: unreachable statement
--> $DIR/unreachable-lint-1.rs:5:13
|
LL | return; bar().await;
| ------ ^^^^^^^^^^^^ unreachable statement
| |
| any code following this expression is unreachable
|
note: lint level defined here
--> $DIR/unreachable-lint-1.rs:2:9
|
LL | #![deny(unreachable_code)]
| ^^^^^^^^^^^^^^^^
error: aborting due to previous error


@ -0,0 +1,13 @@
// check-pass
// edition:2018
#![deny(unreachable_code)]
async fn foo() {
endless().await;
}
async fn endless() -> ! {
loop {}
}
fn main() { }


@ -57,7 +57,7 @@ fn main() {
// check that macro expanded code works
macro_rules! if_cfg {
($cfg:meta $ib:block else $eb:block) => {
($cfg:meta? $ib:block else $eb:block) => {
{
let r;
#[cfg($cfg)]
@ -69,7 +69,7 @@ fn main() {
}
}
let n = if_cfg!(unset {
let n = if_cfg!(unset? {
413
} else {
612


@ -5,7 +5,7 @@ LL | s.the_fn();
| ^^^^^^ method not found in `&Lib::TheStruct`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use Lib::TheTrait;`
error: aborting due to previous error


@ -5,7 +5,7 @@ LL | s.the_fn();
| ^^^^^^ method not found in `&Lib::TheStruct`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use Lib::TheTrait;`
error: aborting due to previous error


@ -5,7 +5,7 @@ LL | s.the_fn();
| ^^^^^^ method not found in `&coherence_inherent_cc_lib::TheStruct`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use coherence_inherent_cc_lib::TheTrait;`
error: aborting due to previous error


@ -5,7 +5,7 @@ LL | s.the_fn();
| ^^^^^^ method not found in `&coherence_inherent_cc_lib::TheStruct`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use coherence_inherent_cc_lib::TheTrait;`
error: aborting due to previous error


@ -1,27 +0,0 @@
#![feature(decl_macro)]
macro_rules! returns_isize(
($ident:ident) => (
fn $ident() -> isize;
)
);
macro takes_u32_returns_u32($ident:ident) {
fn $ident (arg: u32) -> u32;
}
macro_rules! emits_nothing(
() => ()
);
#[link(name = "rust_test_helpers", kind = "static")]
extern {
returns_isize!(rust_get_test_int);
//~^ ERROR macro invocations in `extern {}` blocks are experimental
takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
//~^ ERROR macro invocations in `extern {}` blocks are experimental
emits_nothing!();
//~^ ERROR macro invocations in `extern {}` blocks are experimental
}
fn main() {}


@ -1,30 +0,0 @@
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/feature-gate-macros_in_extern.rs:19:5
|
LL | returns_isize!(rust_get_test_int);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/feature-gate-macros_in_extern.rs:21:5
|
LL | takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/feature-gate-macros_in_extern.rs:23:5
|
LL | emits_nothing!();
| ^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
error: aborting due to 3 previous errors
For more information about this error, try `rustc --explain E0658`.


@ -25,7 +25,7 @@ LL | ().clone()
| ^^^^^ method not found in `()`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use std::clone::Clone;`
error: aborting due to 3 previous errors


@ -8,7 +8,7 @@ LL | pub macro m() { ().f() }
| ^ method not found in `()`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use foo::T;`
error: aborting due to previous error


@ -5,7 +5,7 @@ LL | 1u32.method();
| ^^^^^^ method not found in `u32`
|
= help: items from traits can only be used if the trait is in scope
help: the following traits are implemented but not in scope, perhaps add a `use` for one of them:
help: the following traits are implemented but not in scope; perhaps add a `use` for one of them:
|
LL | use foo::Bar;
|
@ -23,7 +23,7 @@ LL | std::rc::Rc::new(&mut Box::new(&1u32)).method();
| ^^^^^^ method not found in `std::rc::Rc<&mut std::boxed::Box<&u32>>`
|
= help: items from traits can only be used if the trait is in scope
help: the following traits are implemented but not in scope, perhaps add a `use` for one of them:
help: the following traits are implemented but not in scope; perhaps add a `use` for one of them:
|
LL | use foo::Bar;
|
@ -41,7 +41,7 @@ LL | 'a'.method();
| ^^^^^^ method not found in `char`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use foo::Bar;
|
@ -61,7 +61,7 @@ LL | std::rc::Rc::new(&mut Box::new(&'a')).method();
| ^^^^^^ method not found in `std::rc::Rc<&mut std::boxed::Box<&char>>`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use foo::Bar;
|
@ -73,7 +73,7 @@ LL | 1i32.method();
| ^^^^^^ method not found in `i32`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use no_method_suggested_traits::foo::PubPub;
|
@ -85,7 +85,7 @@ LL | std::rc::Rc::new(&mut Box::new(&1i32)).method();
| ^^^^^^ method not found in `std::rc::Rc<&mut std::boxed::Box<&i32>>`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use no_method_suggested_traits::foo::PubPub;
|


@ -5,7 +5,7 @@ LL | b.foo();
| ^^^ method not found in `&b::B`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use a::A;`
error: aborting due to previous error


@ -8,7 +8,7 @@ help: possible better candidate is found in another module, you can import it in
LL | use std::hash::Hash;
|
warning: default bound relaxed for a type parameter, but this does nothing because the given bound is not a default. Only `?Sized` is supported
warning: default bound relaxed for a type parameter, but this does nothing because the given bound is not a default; only `?Sized` is supported
--> $DIR/issue-37534.rs:1:12
|
LL | struct Foo<T: ?Hash> { }


@ -5,7 +5,7 @@ LL | Command::new("echo").arg("hello").exec();
| ^^^^ method not found in `&mut std::process::Command`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use std::os::unix::process::CommandExt;
|


@ -5,7 +5,7 @@ LL | ().a();
| ^ method not found in `()`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use xcrate_issue_43189_b::xcrate_issue_43189_a::A;
|


@ -0,0 +1,6 @@
trait Foo {
fn foo([a, b]: [i32; 2]) {}
//~^ ERROR: patterns aren't allowed in methods without bodies
}
fn main() {}


@ -0,0 +1,13 @@
error[E0642]: patterns aren't allowed in methods without bodies
--> $DIR/issue-50571.rs:2:12
|
LL | fn foo([a, b]: [i32; 2]) {}
| ^^^^^^
help: give this argument a name or use an underscore to ignore it
|
LL | fn foo(_: [i32; 2]) {}
| ^
error: aborting due to previous error
For more information about this error, try `rustc --explain E0642`.


@ -0,0 +1,18 @@
pub trait Foo: Sized {
const SIZE: usize;
fn new(slice: &[u8; Foo::SIZE]) -> Self;
//~^ ERROR: type annotations needed: cannot resolve `_: Foo`
}
pub struct Bar<T: ?Sized>(T);
impl Bar<[u8]> {
const SIZE: usize = 32;
fn new(slice: &[u8; Self::SIZE]) -> Self {
Foo(Box::new(*slice)) //~ ERROR: expected function, found trait `Foo`
}
}
fn main() {}


@ -0,0 +1,19 @@
error[E0423]: expected function, found trait `Foo`
--> $DIR/issue-58022.rs:14:9
|
LL | Foo(Box::new(*slice))
| ^^^ not a function
error[E0283]: type annotations needed: cannot resolve `_: Foo`
--> $DIR/issue-58022.rs:4:25
|
LL | const SIZE: usize;
| ------------------ required by `Foo::SIZE`
LL |
LL | fn new(slice: &[u8; Foo::SIZE]) -> Self;
| ^^^^^^^^^
error: aborting due to 2 previous errors
Some errors have detailed explanations: E0283, E0423.
For more information about an error, try `rustc --explain E0283`.


@ -0,0 +1,50 @@
use std::ops::Add;
trait Trait<T> {
fn get(self) -> T;
}
struct Holder<T>(T);
impl<T> Trait<T> for Holder<T> {
fn get(self) -> T {
self.0
}
}
enum Either<L, R> {
Left(L),
Right(R),
}
impl<L, R> Either<L, R> {
fn converge<T>(self) -> T where L: Trait<T>, R: Trait<T> {
match self {
Either::Left(val) => val.get(),
Either::Right(val) => val.get(),
}
}
}
fn add_generic<A: Add<B>, B>(lhs: A, rhs: B) -> Either<
impl Trait<<A as Add<B>>::Output>,
impl Trait<<A as Add<B>>::Output>
> {
if true {
Either::Left(Holder(lhs + rhs))
} else {
Either::Right(Holder(lhs + rhs))
}
}
fn add_one(
value: u32,
) -> Either<impl Trait<<u32 as Add<u32>>::Output>, impl Trait<<u32 as Add<u32>>::Output>> {
//~^ ERROR: the trait bound `impl Trait<<u32 as std::ops::Add>::Output>: Trait<u32>`
//~| ERROR: the trait bound `impl Trait<<u32 as std::ops::Add>::Output>: Trait<u32>`
add_generic(value, 1u32)
}
pub fn main() {
add_one(3).converge();
}


@ -0,0 +1,19 @@
error[E0277]: the trait bound `impl Trait<<u32 as std::ops::Add>::Output>: Trait<u32>` is not satisfied
--> $DIR/issue-58344.rs:42:13
|
LL | ) -> Either<impl Trait<<u32 as Add<u32>>::Output>, impl Trait<<u32 as Add<u32>>::Output>> {
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `Trait<u32>` is not implemented for `impl Trait<<u32 as std::ops::Add>::Output>`
|
= note: the return type of a function must have a statically known size
error[E0277]: the trait bound `impl Trait<<u32 as std::ops::Add>::Output>: Trait<u32>` is not satisfied
--> $DIR/issue-58344.rs:42:52
|
LL | ) -> Either<impl Trait<<u32 as Add<u32>>::Output>, impl Trait<<u32 as Add<u32>>::Output>> {
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `Trait<u32>` is not implemented for `impl Trait<<u32 as std::ops::Add>::Output>`
|
= note: the return type of a function must have a statically known size
error: aborting due to 2 previous errors
For more information about this error, try `rustc --explain E0277`.


@ -1,5 +1,3 @@
#![feature(macros_in_extern)]
macro_rules! m {
() => {
let //~ ERROR expected


@ -1,5 +1,5 @@
error: expected one of `crate`, `fn`, `pub`, `static`, or `type`, found `let`
--> $DIR/issue-54441.rs:5:9
--> $DIR/issue-54441.rs:3:9
|
LL | let
| ^^^ unexpected token


@ -252,12 +252,6 @@ test_path!(::std);
test_path!(std::u8,);
test_path!(any, super, super::super::self::path, X<Y>::Z<'a, T=U>);
macro_rules! test_meta_block {
($($m:meta)* $b:block) => {};
}
test_meta_block!(windows {});
macro_rules! test_lifetime {
(1. $($l:lifetime)* $($b:block)*) => {};
(2. $($b:block)* $($l:lifetime)*) => {};


@ -0,0 +1,11 @@
// check-pass
macro_rules! check { ($meta:meta) => () }
check!(meta(a b c d));
check!(meta[a b c d]);
check!(meta { a b c d });
check!(meta);
check!(meta = 0);
fn main() {}


@ -1,30 +0,0 @@
// run-pass
// ignore-wasm32
#![feature(decl_macro, macros_in_extern)]
macro_rules! returns_isize(
($ident:ident) => (
fn $ident() -> isize;
)
);
macro takes_u32_returns_u32($ident:ident) {
fn $ident (arg: u32) -> u32;
}
macro_rules! emits_nothing(
() => ()
);
fn main() {
assert_eq!(unsafe { rust_get_test_int() }, 1isize);
assert_eq!(unsafe { rust_dbg_extern_identity_u32(0xDEADBEEF) }, 0xDEADBEEFu32);
}
#[link(name = "rust_test_helpers", kind = "static")]
extern {
returns_isize!(rust_get_test_int);
takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
emits_nothing!();
}


@ -1,3 +1,4 @@
// run-pass
// ignore-wasm32
#![feature(decl_macro)]
@ -16,17 +17,29 @@ macro_rules! emits_nothing(
() => ()
);
macro_rules! emits_multiple(
() => {
fn f1() -> u32;
fn f2() -> u32;
}
);
mod defs {
#[no_mangle] extern fn f1() -> u32 { 1 }
#[no_mangle] extern fn f2() -> u32 { 2 }
}
fn main() {
assert_eq!(unsafe { rust_get_test_int() }, 0isize);
assert_eq!(unsafe { rust_get_test_int() }, 1);
assert_eq!(unsafe { rust_dbg_extern_identity_u32(0xDEADBEEF) }, 0xDEADBEEFu32);
assert_eq!(unsafe { f1() }, 1);
assert_eq!(unsafe { f2() }, 2);
}
#[link(name = "rust_test_helpers", kind = "static")]
extern {
returns_isize!(rust_get_test_int);
//~^ ERROR macro invocations in `extern {}` blocks are experimental
takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
//~^ ERROR macro invocations in `extern {}` blocks are experimental
emits_nothing!();
//~^ ERROR macro invocations in `extern {}` blocks are experimental
emits_multiple!();
}


@ -34,7 +34,7 @@ error[E0203]: type parameter has more than one relaxed default bound, only one i
LL | struct S5<T>(*const T) where T: ?Trait<'static> + ?Sized;
| ^
warning: default bound relaxed for a type parameter, but this does nothing because the given bound is not a default. Only `?Sized` is supported
warning: default bound relaxed for a type parameter, but this does nothing because the given bound is not a default; only `?Sized` is supported
--> $DIR/maybe-bounds-where.rs:15:11
|
LL | struct S5<T>(*const T) where T: ?Trait<'static> + ?Sized;


@ -7,5 +7,4 @@ fn main() {
once::<&str>("str").fuse().filter(|a: &str| true).count();
//~^ ERROR no method named `count`
//~| ERROR type mismatch in closure arguments
//~| ERROR type mismatch in closure arguments
}


@ -16,14 +16,6 @@ LL | once::<&str>("str").fuse().filter(|a: &str| true).count();
| |
| expected signature of `for<'r> fn(&'r &str) -> _`
error[E0631]: type mismatch in closure arguments
--> $DIR/issue-36053-2.rs:7:32
|
LL | once::<&str>("str").fuse().filter(|a: &str| true).count();
| ^^^^^^ -------------- found signature of `for<'r> fn(&'r str) -> _`
| |
| expected signature of `fn(&&str) -> _`
error: aborting due to 3 previous errors
error: aborting due to 2 previous errors
For more information about this error, try `rustc --explain E0599`.


@ -0,0 +1,15 @@
// In this regression test we check that a trailing `|` in an or-pattern just
// before the `if` token of a `match` guard will receive parser recovery with
// an appropriate error message.
enum E { A, B }
fn main() {
match E::A {
E::A |
E::B | //~ ERROR a trailing `|` is not allowed in an or-pattern
if true => {
let recovery_witness: bool = 0; //~ ERROR mismatched types
}
}
}


@ -0,0 +1,20 @@
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/issue-64879-trailing-before-guard.rs:10:14
|
LL | E::A |
| ---- while parsing this or-pattern starting here
LL | E::B |
| ^ help: remove the `|`
error[E0308]: mismatched types
--> $DIR/issue-64879-trailing-before-guard.rs:12:42
|
LL | let recovery_witness: bool = 0;
| ^ expected bool, found integer
|
= note: expected type `bool`
found type `{integer}`
error: aborting due to 2 previous errors
For more information about this error, try `rustc --explain E0308`.


@ -2,37 +2,49 @@ error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:8:15
|
LL | 1 | 2 || 3 => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here
error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:13:16
|
LL | (1 | 2 || 3) => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here
error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:18:16
|
LL | (1 | 2 || 3,) => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here
error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:25:18
|
LL | TS(1 | 2 || 3) => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here
error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:32:23
|
LL | NS { f: 1 | 2 || 3 } => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here
error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:37:16
|
LL | [1 | 2 || 3] => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here
error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:42:9


@ -51,24 +51,32 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let ( || A | B) = E::A;
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: a leading `|` is only allowed in a top-level pattern
--> $DIR/or-patterns-syntactic-fail.rs:48:11
|
LL | let [ || A | B ] = [E::A];
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: a leading `|` is only allowed in a top-level pattern
--> $DIR/or-patterns-syntactic-fail.rs:49:13
|
LL | let TS( || A | B );
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: a leading `|` is only allowed in a top-level pattern
--> $DIR/or-patterns-syntactic-fail.rs:50:17
|
LL | let NS { f: || A | B };
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: no rules expected the token `|`
--> $DIR/or-patterns-syntactic-fail.rs:14:15


@ -1,4 +1,4 @@
// Test the suggestion to remove a leading `|`.
// Test the suggestion to remove a leading, or trailing `|`.
// run-rustfix
@ -8,7 +8,7 @@
fn main() {}
#[cfg(FALSE)]
fn leading_vert() {
fn leading() {
fn fun1( A: E) {} //~ ERROR a leading `|` is not allowed in a parameter pattern
fn fun2( A: E) {} //~ ERROR a leading `|` is not allowed in a parameter pattern
let ( A): E; //~ ERROR a leading `|` is only allowed in a top-level pattern
@ -21,3 +21,26 @@ fn leading_vert() {
let NS { f: A }: NS; //~ ERROR a leading `|` is only allowed in a top-level pattern
let NS { f: A }: NS; //~ ERROR a leading `|` is only allowed in a top-level pattern
}
#[cfg(FALSE)]
fn trailing() {
let ( A ): E; //~ ERROR a trailing `|` is not allowed in an or-pattern
let (a ,): (E,); //~ ERROR a trailing `|` is not allowed in an or-pattern
let ( A | B ): E; //~ ERROR a trailing `|` is not allowed in an or-pattern
let [ A | B ]: [E; 1]; //~ ERROR a trailing `|` is not allowed in an or-pattern
let S { f: B }; //~ ERROR a trailing `|` is not allowed in an or-pattern
let ( A | B ): E; //~ ERROR unexpected token `||` after pattern
//~^ ERROR a trailing `|` is not allowed in an or-pattern
match A {
A => {} //~ ERROR a trailing `|` is not allowed in an or-pattern
A => {} //~ ERROR a trailing `|` is not allowed in an or-pattern
A | B => {} //~ ERROR unexpected token `||` after pattern
//~^ ERROR a trailing `|` is not allowed in an or-pattern
| A | B => {}
//~^ ERROR a trailing `|` is not allowed in an or-pattern
}
let a : u8 = 0; //~ ERROR a trailing `|` is not allowed in an or-pattern
let a = 0; //~ ERROR a trailing `|` is not allowed in an or-pattern
let a ; //~ ERROR a trailing `|` is not allowed in an or-pattern
}


@ -1,4 +1,4 @@
// Test the suggestion to remove a leading `|`.
// Test the suggestion to remove a leading, or trailing `|`.
// run-rustfix
@ -8,7 +8,7 @@
fn main() {}
#[cfg(FALSE)]
fn leading_vert() {
fn leading() {
fn fun1( | A: E) {} //~ ERROR a leading `|` is not allowed in a parameter pattern
fn fun2( || A: E) {} //~ ERROR a leading `|` is not allowed in a parameter pattern
let ( | A): E; //~ ERROR a leading `|` is only allowed in a top-level pattern
@ -21,3 +21,26 @@ fn leading_vert() {
let NS { f: | A }: NS; //~ ERROR a leading `|` is only allowed in a top-level pattern
let NS { f: || A }: NS; //~ ERROR a leading `|` is only allowed in a top-level pattern
}
#[cfg(FALSE)]
fn trailing() {
let ( A | ): E; //~ ERROR a trailing `|` is not allowed in an or-pattern
let (a |,): (E,); //~ ERROR a trailing `|` is not allowed in an or-pattern
let ( A | B | ): E; //~ ERROR a trailing `|` is not allowed in an or-pattern
let [ A | B | ]: [E; 1]; //~ ERROR a trailing `|` is not allowed in an or-pattern
let S { f: B | }; //~ ERROR a trailing `|` is not allowed in an or-pattern
let ( A || B | ): E; //~ ERROR unexpected token `||` after pattern
//~^ ERROR a trailing `|` is not allowed in an or-pattern
match A {
A | => {} //~ ERROR a trailing `|` is not allowed in an or-pattern
A || => {} //~ ERROR a trailing `|` is not allowed in an or-pattern
A || B | => {} //~ ERROR unexpected token `||` after pattern
//~^ ERROR a trailing `|` is not allowed in an or-pattern
| A | B | => {}
//~^ ERROR a trailing `|` is not allowed in an or-pattern
}
let a | : u8 = 0; //~ ERROR a trailing `|` is not allowed in an or-pattern
let a | = 0; //~ ERROR a trailing `|` is not allowed in an or-pattern
let a | ; //~ ERROR a trailing `|` is not allowed in an or-pattern
}


@ -9,6 +9,8 @@ error: a leading `|` is not allowed in a parameter pattern
|
LL | fn fun2( || A: E) {}
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: a leading `|` is only allowed in a top-level pattern
--> $DIR/remove-leading-vert.rs:14:11
@ -21,6 +23,8 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let ( || A): (E);
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: a leading `|` is only allowed in a top-level pattern
--> $DIR/remove-leading-vert.rs:16:11
@ -39,6 +43,8 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let [ || A ]: [E; 1];
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: a leading `|` is only allowed in a top-level pattern
--> $DIR/remove-leading-vert.rs:19:13
@ -51,6 +57,8 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let TS( || A ): TS;
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: a leading `|` is only allowed in a top-level pattern
--> $DIR/remove-leading-vert.rs:21:17
@ -63,6 +71,130 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let NS { f: || A }: NS;
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: aborting due to 11 previous errors
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:27:13
|
LL | let ( A | ): E;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:28:12
|
LL | let (a |,): (E,);
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:29:17
|
LL | let ( A | B | ): E;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:30:17
|
LL | let [ A | B | ]: [E; 1];
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:31:18
|
LL | let S { f: B | };
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: unexpected token `||` after pattern
--> $DIR/remove-leading-vert.rs:32:13
|
LL | let ( A || B | ): E;
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:32:18
|
LL | let ( A || B | ): E;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:35:11
|
LL | A | => {}
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:36:11
|
LL | A || => {}
| - ^^ help: remove the `||`
| |
| while parsing this or-pattern starting here
|
= note: alternatives in or-patterns are separated with `|`, not `||`
error: unexpected token `||` after pattern
--> $DIR/remove-leading-vert.rs:37:11
|
LL | A || B | => {}
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:37:16
|
LL | A || B | => {}
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:39:17
|
LL | | A | B | => {}
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:43:11
|
LL | let a | : u8 = 0;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:44:11
|
LL | let a | = 0;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:45:11
|
LL | let a | ;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here
error: aborting due to 26 previous errors


@ -1,26 +0,0 @@
// force-host
// no-prefer-dynamic
#![crate_type = "proc-macro"]
extern crate proc_macro;
use proc_macro::TokenStream;
#[proc_macro_attribute]
pub fn nop_attr(_attr: TokenStream, input: TokenStream) -> TokenStream {
assert!(_attr.to_string().is_empty());
input
}
#[proc_macro_attribute]
pub fn no_output(_attr: TokenStream, _input: TokenStream) -> TokenStream {
assert!(_attr.to_string().is_empty());
assert!(!_input.to_string().is_empty());
"".parse().unwrap()
}
#[proc_macro]
pub fn emit_input(input: TokenStream) -> TokenStream {
input
}


@ -7,8 +7,6 @@
// normalize-stdout-test "bytes\([^0]\w*\.\.(\w+)\)" -> "bytes(LO..$1)"
// normalize-stdout-test "bytes\((\w+)\.\.[^0]\w*\)" -> "bytes($1..HI)"
#![feature(proc_macro_hygiene)]
#[macro_use]
extern crate test_macros;
extern crate dollar_crate_external;


@ -1,7 +1,5 @@
// aux-build:lifetimes.rs
#![feature(proc_macro_hygiene)]
extern crate lifetimes;
use lifetimes::*;


@ -1,5 +1,5 @@
error: expected type, found `'`
--> $DIR/lifetimes.rs:9:10
--> $DIR/lifetimes.rs:7:10
|
LL | type A = single_quote_alone!();
| ^^^^^^^^^^^^^^^^^^^^^ this macro call doesn't expand to a type


@ -0,0 +1,6 @@
extern {
#[derive(Copy)] //~ ERROR `derive` may only be applied to structs, enums and unions
fn f();
}
fn main() {}


@ -0,0 +1,8 @@
error: `derive` may only be applied to structs, enums and unions
--> $DIR/macros-in-extern-derive.rs:2:5
|
LL | #[derive(Copy)]
| ^^^^^^^^^^^^^^^
error: aborting due to previous error


@ -1,25 +0,0 @@
// run-pass
// aux-build:test-macros-rpass.rs
// ignore-wasm32
#![feature(macros_in_extern)]
extern crate test_macros_rpass as test_macros;
use test_macros::{nop_attr, no_output, emit_input};
fn main() {
assert_eq!(unsafe { rust_get_test_int() }, 1isize);
assert_eq!(unsafe { rust_dbg_extern_identity_u32(0xDEADBEEF) }, 0xDEADBEEF);
}
#[link(name = "rust_test_helpers", kind = "static")]
extern {
#[no_output]
fn some_definitely_unknown_symbol_which_should_be_removed();
#[nop_attr]
fn rust_get_test_int() -> isize;
emit_input!(fn rust_dbg_extern_identity_u32(arg: u32) -> u32;);
}


@ -1,3 +1,4 @@
// run-pass
// aux-build:test-macros.rs
// ignore-wasm32
@ -5,20 +6,17 @@
extern crate test_macros;
fn main() {
assert_eq!(unsafe { rust_get_test_int() }, 0isize);
assert_eq!(unsafe { rust_get_test_int() }, 1);
assert_eq!(unsafe { rust_dbg_extern_identity_u32(0xDEADBEEF) }, 0xDEADBEEF);
}
#[link(name = "rust_test_helpers", kind = "static")]
extern {
#[empty_attr]
//~^ ERROR macro invocations in `extern {}` blocks are experimental
fn some_definitely_unknown_symbol_which_should_be_removed();
#[identity_attr]
//~^ ERROR macro invocations in `extern {}` blocks are experimental
fn rust_get_test_int() -> isize;
identity!(fn rust_dbg_extern_identity_u32(arg: u32) -> u32;);
//~^ ERROR macro invocations in `extern {}` blocks are experimental
}


@ -0,0 +1,11 @@
// check-pass
// aux-build:test-macros.rs
#[macro_use]
extern crate test_macros;
const C: identity!(u8) = 10;
fn main() {
let c: u8 = C;
}


@ -50,7 +50,6 @@ fn attrs() {
}
fn main() {
let _x: identity!(u32) = 3; //~ ERROR: procedural macros cannot be expanded to types
if let identity!(Some(_x)) = Some(3) {}
//~^ ERROR: procedural macros cannot be expanded to patterns


@ -94,17 +94,8 @@ LL | let _x = #[identity_attr] println!();
= note: for more information, see https://github.com/rust-lang/rust/issues/54727
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable
error[E0658]: procedural macros cannot be expanded to types
--> $DIR/proc-macro-gates.rs:53:13
|
LL | let _x: identity!(u32) = 3;
| ^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/54727
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable
error[E0658]: procedural macros cannot be expanded to patterns
--> $DIR/proc-macro-gates.rs:54:12
--> $DIR/proc-macro-gates.rs:53:12
|
LL | if let identity!(Some(_x)) = Some(3) {}
| ^^^^^^^^^^^^^^^^^^^
@ -113,7 +104,7 @@ LL | if let identity!(Some(_x)) = Some(3) {}
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable
error[E0658]: procedural macros cannot be expanded to statements
--> $DIR/proc-macro-gates.rs:57:5
--> $DIR/proc-macro-gates.rs:56:5
|
LL | empty!(struct S;);
| ^^^^^^^^^^^^^^^^^^
@ -122,7 +113,7 @@ LL | empty!(struct S;);
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable
error[E0658]: procedural macros cannot be expanded to statements
--> $DIR/proc-macro-gates.rs:58:5
--> $DIR/proc-macro-gates.rs:57:5
|
LL | empty!(let _x = 3;);
| ^^^^^^^^^^^^^^^^^^^^
@ -131,7 +122,7 @@ LL | empty!(let _x = 3;);
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable
error[E0658]: procedural macros cannot be expanded to expressions
--> $DIR/proc-macro-gates.rs:60:14
--> $DIR/proc-macro-gates.rs:59:14
|
LL | let _x = identity!(3);
| ^^^^^^^^^^^^
@ -140,7 +131,7 @@ LL | let _x = identity!(3);
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable
error[E0658]: procedural macros cannot be expanded to expressions
--> $DIR/proc-macro-gates.rs:61:15
--> $DIR/proc-macro-gates.rs:60:15
|
LL | let _x = [empty!(3)];
| ^^^^^^^^^
@ -148,6 +139,6 @@ LL | let _x = [empty!(3)];
= note: for more information, see https://github.com/rust-lang/rust/issues/54727
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable
error: aborting due to 17 previous errors
error: aborting due to 16 previous errors
For more information about this error, try `rustc --explain E0658`.


@ -5,7 +5,7 @@ LL | x.foobar();
| ^^^^^^ method not found in `u32`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use crate::foo::foobar::Foobar;`
error[E0599]: no method named `bar` found for type `u32` in the current scope
@ -15,7 +15,7 @@ LL | x.bar();
| ^^^ method not found in `u32`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use crate::foo::Bar;
|
@ -33,7 +33,7 @@ LL | let y = u32::from_str("33");
| ^^^^^^^^ function or associated item not found in `u32`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use std::str::FromStr;
|


@ -5,7 +5,7 @@ LL | ().f()
| ^ method not found in `()`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use foo::T;
|

Some files were not shown because too many files have changed in this diff Show More