
Commit e6947ec

Rollup merge of #49597 - alexcrichton:proc-macro-v2, r=petrochenkov
proc_macro: Reorganize public API

This commit is a reorganization of the `proc_macro` crate's public user-facing API, the result of a number of discussions at the recent Rust All-Hands where we're hoping to get the `proc_macro` crate into ship shape for stabilization of a subset of its functionality in the Rust 2018 release.

The reorganization here is motivated by experience from the `proc-macro2`, `quote`, and `syn` crates on crates.io (and other crates which depend on them). The main focus is future flexibility, along with making a few more operations consistent and/or fixing bugs. A summary of the changes from today's `proc_macro` API:

* The `TokenNode` enum has been removed, as have the public fields of `TokenTree`. Instead, the `TokenTree` type is now a public enum (what `TokenNode` was) and each variant is an opaque struct which internally contains `Span` information. This makes the various tokens a bit more consistent, requires fewer wrappers, and otherwise provides good future-compatibility, as opaque structs are easy to modify later on.

* `Literal` integer constructors have been expanded to be unambiguous as to what they're doing and to allow for more future flexibility. Previously, constructors like `Literal::float` and `Literal::integer` created unsuffixed literals, while the concrete methods like `Literal::i32` created suffixed tokens. The suffixed/unsuffixed distinction wasn't immediately clear to all users, and having *one* constructor for unsuffixed literals required us to pick a largest type, which may not always exist. To fix these issues, all constructors are now of the form `Literal::i32_unsuffixed` or `Literal::i32_suffixed` (for all integral types). This allows future compatibility as well as making it immediately clear what's suffixed and what isn't.

* Each variant of `TokenTree` internally contains a `Span`, which can also be configured via `set_span`. For example, `Literal` and `Term` now both internally contain a `Span` rather than having it stored in an auxiliary location.

* Constructors of all tokens are now called `new` (aka `Term::intern` is gone) and most do not take spans. Manufactured tokens typically don't have a fresh span to go with them, and the span is purely used for error-reporting, **except** the span for `Term`, which currently affects hygiene. The default span for these constructed tokens is `Span::call_site()` for now. The `Term` type's constructor explicitly requires passing in a `Span` to provide future-proofing against possible hygiene changes. It's intended that a first pass of stabilization will likely only stabilize `Span::call_site()`, which is an explicit opt-in for "I would like no hygiene here please". The intention is to make this explicit in procedural macros so they are forwards-compatible with a hygiene-specifying solution.

* Some of the conversions for `TokenStream` have been simplified a little.

* The `TokenTreeIter` iterator was renamed to `token_stream::IntoIter`.

Overall the hope is that this is the "final pass" at the API of `TokenStream` and most of `TokenTree` before stabilization. Explicitly left out here is any change to `Span`'s API, which will likely need to be re-evaluated before stabilization. All changes in this PR have already been reflected in the [`proc-macro2`], `quote`, and `syn` crates, and new versions of each have been published to crates.io.

Once this lands in nightly I plan on making an internals post again, summarizing the changes made here and calling on all macro authors to give the APIs a spin and see how they work. Hopefully, pending no major issues, we can then have an FCP to stabilize later this cycle!

[`proc-macro2`]: https://docs.rs/proc-macro2/0.3.1/proc_macro2/

Closes #49596
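To make the new shapes concrete, here is a minimal sketch (not taken from the PR) of a function-like procedural macro written against the reorganized API. The macro name `make_answer` is hypothetical, and the snippet assumes a `proc-macro` crate type on a nightly of this era with `#![feature(proc_macro)]`:

    // Sketch only: builds `fn answer() -> i32 { 42 }` with the new API.
    #![feature(proc_macro)] // assumed feature gate on nightlies of this era
    extern crate proc_macro;

    use proc_macro::{Delimiter, Group, Literal, Op, Spacing, Span, Term,
                     TokenStream, TokenTree};

    #[proc_macro]
    pub fn make_answer(_input: TokenStream) -> TokenStream {
        // Constructors are all `new`-style; only `Term::new` takes an explicit
        // `Span`, and `Span::call_site()` is the explicit "no hygiene" opt-in.
        vec![
            TokenTree::from(Term::new("fn", Span::call_site())),
            TokenTree::from(Term::new("answer", Span::call_site())),
            TokenTree::from(Group::new(Delimiter::Parenthesis, TokenStream::empty())),
            TokenTree::from(Op::new('-', Spacing::Joint)),
            TokenTree::from(Op::new('>', Spacing::Alone)),
            TokenTree::from(Term::new("i32", Span::call_site())),
            TokenTree::from(Group::new(
                Delimiter::Brace,
                // Suffixed vs unsuffixed is now spelled out in the constructor.
                TokenStream::from(TokenTree::from(Literal::i32_unsuffixed(42))),
            )),
        ].into_iter().collect()
    }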
2 parents 72ac3eb + a57b1fb commit e6947ec

File tree: 10 files changed, +732 -314 lines

src/libproc_macro/lib.rs

+568 -185 (large diff not rendered)
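The `lib.rs` diff itself is too large to render here, but two of the smaller changes it contains — the `TokenTreeIter` to `token_stream::IntoIter` rename and the opaque accessor style — look roughly like this from a caller's side. This is a sketch under the assumption that the post-PR nightly API matches the commit message; the helper names `count_tokens` and `count_deep` are hypothetical:

    extern crate proc_macro;

    use proc_macro::{token_stream, TokenStream, TokenTree};

    // What used to be `TokenTreeIter` is now `token_stream::IntoIter`.
    fn count_tokens(stream: TokenStream) -> usize {
        let iter: token_stream::IntoIter = stream.into_iter();
        iter.count()
    }

    // Groups are opaque now, so nesting is reached through `Group::stream()`.
    fn count_deep(stream: TokenStream) -> usize {
        stream.into_iter().map(|tree| match tree {
            TokenTree::Group(g) => 1 + count_deep(g.stream()),
            _ => 1,
        }).sum()
    }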

src/libproc_macro/quote.rs

+83 -55
@@ -14,7 +14,7 @@
 //! This quasiquoter uses macros 2.0 hygiene to reliably access
 //! items from `proc_macro`, to build a `proc_macro::TokenStream`.
 
-use {Delimiter, Literal, Spacing, Span, Term, TokenNode, TokenStream, TokenTree};
+use {Delimiter, Literal, Spacing, Span, Term, Op, Group, TokenStream, TokenTree};
 
 use syntax::ext::base::{ExtCtxt, ProcMacro};
 use syntax::parse::token;
@@ -23,47 +23,59 @@ use syntax::tokenstream;
 pub struct Quoter;
 
 pub fn unquote<T: Into<TokenStream> + Clone>(tokens: &T) -> TokenStream {
-    T::into(tokens.clone())
+    tokens.clone().into()
 }
 
 pub trait Quote {
     fn quote(self) -> TokenStream;
 }
 
+macro_rules! tt2ts {
+    ($e:expr) => (TokenStream::from(TokenTree::from($e)))
+}
+
 macro_rules! quote_tok {
-    (,) => { TokenNode::Op(',', Spacing::Alone) };
-    (.) => { TokenNode::Op('.', Spacing::Alone) };
-    (:) => { TokenNode::Op(':', Spacing::Alone) };
+    (,) => { tt2ts!(Op::new(',', Spacing::Alone)) };
+    (.) => { tt2ts!(Op::new('.', Spacing::Alone)) };
+    (:) => { tt2ts!(Op::new(':', Spacing::Alone)) };
+    (|) => { tt2ts!(Op::new('|', Spacing::Alone)) };
     (::) => {
         [
-            TokenNode::Op(':', Spacing::Joint),
-            TokenNode::Op(':', Spacing::Alone)
-        ].iter().cloned().collect::<TokenStream>()
+            TokenTree::from(Op::new(':', Spacing::Joint)),
+            TokenTree::from(Op::new(':', Spacing::Alone)),
+        ].iter()
+            .cloned()
+            .map(|mut x| {
+                x.set_span(Span::def_site());
+                x
+            })
+            .collect::<TokenStream>()
     };
-    (!) => { TokenNode::Op('!', Spacing::Alone) };
-    (<) => { TokenNode::Op('<', Spacing::Alone) };
-    (>) => { TokenNode::Op('>', Spacing::Alone) };
-    (_) => { TokenNode::Op('_', Spacing::Alone) };
-    (0) => { TokenNode::Literal(::Literal::integer(0)) };
-    (&) => { TokenNode::Op('&', Spacing::Alone) };
-    ($i:ident) => { TokenNode::Term(Term::intern(stringify!($i))) };
+    (!) => { tt2ts!(Op::new('!', Spacing::Alone)) };
+    (<) => { tt2ts!(Op::new('<', Spacing::Alone)) };
+    (>) => { tt2ts!(Op::new('>', Spacing::Alone)) };
+    (_) => { tt2ts!(Op::new('_', Spacing::Alone)) };
+    (0) => { tt2ts!(Literal::i8_unsuffixed(0)) };
+    (&) => { tt2ts!(Op::new('&', Spacing::Alone)) };
+    ($i:ident) => { tt2ts!(Term::new(stringify!($i), Span::def_site())) };
 }
 
 macro_rules! quote_tree {
     ((unquote $($t:tt)*)) => { $($t)* };
     ((quote $($t:tt)*)) => { ($($t)*).quote() };
-    (($($t:tt)*)) => { TokenNode::Group(Delimiter::Parenthesis, quote!($($t)*)) };
-    ([$($t:tt)*]) => { TokenNode::Group(Delimiter::Bracket, quote!($($t)*)) };
-    ({$($t:tt)*}) => { TokenNode::Group(Delimiter::Brace, quote!($($t)*)) };
+    (($($t:tt)*)) => { tt2ts!(Group::new(Delimiter::Parenthesis, quote!($($t)*))) };
+    ([$($t:tt)*]) => { tt2ts!(Group::new(Delimiter::Bracket, quote!($($t)*))) };
+    ({$($t:tt)*}) => { tt2ts!(Group::new(Delimiter::Brace, quote!($($t)*))) };
     ($t:tt) => { quote_tok!($t) };
 }
 
 macro_rules! quote {
     () => { TokenStream::empty() };
     ($($t:tt)*) => {
-        [
-            $(TokenStream::from(quote_tree!($t)),)*
-        ].iter().cloned().collect::<TokenStream>()
+        [$(quote_tree!($t),)*].iter()
+            .cloned()
+            .flat_map(|x| x.into_iter())
+            .collect::<TokenStream>()
     };
 }
 
@@ -97,72 +109,81 @@ impl Quote for TokenStream {
         let tokens = self.into_iter().filter_map(|tree| {
             if after_dollar {
                 after_dollar = false;
-                match tree.kind {
-                    TokenNode::Term(_) => {
+                match tree {
+                    TokenTree::Term(_) => {
+                        let tree = TokenStream::from(tree);
                         return Some(quote!(::__internal::unquote(&(unquote tree)),));
                     }
-                    TokenNode::Op('$', _) => {}
+                    TokenTree::Op(ref tt) if tt.op() == '$' => {}
                     _ => panic!("`$` must be followed by an ident or `$` in `quote!`"),
                 }
-            } else if let TokenNode::Op('$', _) = tree.kind {
-                after_dollar = true;
-                return None;
+            } else if let TokenTree::Op(tt) = tree {
+                if tt.op() == '$' {
+                    after_dollar = true;
+                    return None;
+                }
             }
 
             Some(quote!(::TokenStream::from((quote tree)),))
-        }).collect::<TokenStream>();
+        }).flat_map(|t| t.into_iter()).collect::<TokenStream>();
 
         if after_dollar {
             panic!("unexpected trailing `$` in `quote!`");
         }
 
-        quote!([(unquote tokens)].iter().cloned().collect::<::TokenStream>())
+        quote!(
+            [(unquote tokens)].iter()
+                .cloned()
+                .flat_map(|x| x.into_iter())
+                .collect::<::TokenStream>()
+        )
     }
 }
 
 impl Quote for TokenTree {
     fn quote(self) -> TokenStream {
-        quote!(::TokenTree { span: (quote self.span), kind: (quote self.kind) })
+        match self {
+            TokenTree::Op(tt) => quote!(::TokenTree::Op( (quote tt) )),
+            TokenTree::Group(tt) => quote!(::TokenTree::Group( (quote tt) )),
+            TokenTree::Term(tt) => quote!(::TokenTree::Term( (quote tt) )),
+            TokenTree::Literal(tt) => quote!(::TokenTree::Literal( (quote tt) )),
+        }
     }
 }
 
-impl Quote for TokenNode {
+impl Quote for char {
     fn quote(self) -> TokenStream {
-        macro_rules! gen_match {
-            ($($i:ident($($arg:ident),+)),*) => {
-                match self {
-                    $(TokenNode::$i($($arg),+) => quote! {
-                        ::TokenNode::$i($((quote $arg)),+)
-                    },)*
-                }
-            }
-        }
+        TokenTree::from(Literal::character(self)).into()
+    }
+}
 
-        gen_match! { Op(op, kind), Group(delim, tokens), Term(term), Literal(lit) }
+impl<'a> Quote for &'a str {
+    fn quote(self) -> TokenStream {
+        TokenTree::from(Literal::string(self)).into()
     }
 }
 
-impl Quote for char {
+impl Quote for usize {
     fn quote(self) -> TokenStream {
-        TokenNode::Literal(Literal::character(self)).into()
+        TokenTree::from(Literal::usize_unsuffixed(self)).into()
     }
 }
 
-impl<'a> Quote for &'a str {
+impl Quote for Group {
     fn quote(self) -> TokenStream {
-        TokenNode::Literal(Literal::string(self)).into()
+        quote!(::Group::new((quote self.delimiter()), (quote self.stream())))
     }
 }
 
-impl Quote for usize {
+impl Quote for Op {
     fn quote(self) -> TokenStream {
-        TokenNode::Literal(Literal::integer(self as i128)).into()
+        quote!(::Op::new((quote self.op()), (quote self.spacing())))
     }
 }
 
 impl Quote for Term {
     fn quote(self) -> TokenStream {
-        quote!(::Term::intern((quote self.as_str())))
+        quote!(::Term::new((quote self.as_str()), (quote self.span())))
     }
 }
 
@@ -182,31 +203,38 @@ macro_rules! literals {
     impl LiteralKind {
         pub fn with_contents_and_suffix(self, contents: Term, suffix: Option<Term>)
             -> Literal {
-            let contents = contents.0;
-            let suffix = suffix.map(|t| t.0);
+            let sym = contents.sym;
+            let suffix = suffix.map(|t| t.sym);
             match self {
                 $(LiteralKind::$i => {
-                    Literal(token::Literal(token::Lit::$i(contents), suffix))
+                    Literal {
+                        token: token::Literal(token::Lit::$i(sym), suffix),
+                        span: contents.span,
+                    }
                 })*
                 $(LiteralKind::$raw(n) => {
-                    Literal(token::Literal(token::Lit::$raw(contents, n), suffix))
+                    Literal {
+                        token: token::Literal(token::Lit::$raw(sym, n), suffix),
+                        span: contents.span,
+                    }
                 })*
             }
         }
     }
 
     impl Literal {
         fn kind_contents_and_suffix(self) -> (LiteralKind, Term, Option<Term>) {
-            let (lit, suffix) = match self.0 {
+            let (lit, suffix) = match self.token {
                 token::Literal(lit, suffix) => (lit, suffix),
-                _ => panic!("unsupported literal {:?}", self.0),
+                _ => panic!("unsupported literal {:?}", self.token),
             };
 
             let (kind, contents) = match lit {
                 $(token::Lit::$i(contents) => (LiteralKind::$i, contents),)*
                 $(token::Lit::$raw(contents, n) => (LiteralKind::$raw(n), contents),)*
             };
-            (kind, Term(contents), suffix.map(Term))
+            let suffix = suffix.map(|sym| Term::new(&sym.as_str(), self.span()));
+            (kind, Term::new(&contents.as_str(), self.span()), suffix)
         }
     }
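The recurring change in this file is that every quoted fragment now expands to a whole `TokenStream` (via the new `tt2ts!` helper) instead of a single `TokenNode`, so the pieces get stitched together by flattening streams rather than collecting trees. A standalone sketch of that idiom (the function name `concat_streams` is hypothetical):

    extern crate proc_macro;

    use proc_macro::TokenStream;

    // Flatten a slice of already-built streams into one, the same way the
    // rewritten `quote!` does with `.flat_map(|x| x.into_iter())`.
    fn concat_streams(parts: &[TokenStream]) -> TokenStream {
        parts.iter()
             .cloned()
             .flat_map(|x| x.into_iter()) // each TokenStream -> its TokenTrees
             .collect()                   // TokenTrees -> one TokenStream
    }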

src/test/compile-fail-fulldeps/proc-macro/auxiliary/attributes-included.rs

+47 -40
@@ -16,7 +16,7 @@
 
 extern crate proc_macro;
 
-use proc_macro::{TokenStream, TokenTree, TokenNode, Delimiter, Literal, Spacing};
+use proc_macro::{TokenStream, TokenTree, Delimiter, Literal, Spacing, Group};
 
 #[proc_macro_attribute]
 pub fn foo(attr: TokenStream, input: TokenStream) -> TokenStream {
@@ -52,24 +52,30 @@ pub fn bar(attr: TokenStream, input: TokenStream) -> TokenStream {
 }
 
 fn assert_inline(slice: &mut &[TokenTree]) {
-    match slice[0].kind {
-        TokenNode::Op('#', _) => {}
+    match &slice[0] {
+        TokenTree::Op(tt) => assert_eq!(tt.op(), '#'),
         _ => panic!("expected '#' char"),
     }
-    match slice[1].kind {
-        TokenNode::Group(Delimiter::Bracket, _) => {}
+    match &slice[1] {
+        TokenTree::Group(tt) => assert_eq!(tt.delimiter(), Delimiter::Bracket),
         _ => panic!("expected brackets"),
     }
     *slice = &slice[2..];
 }
 
 fn assert_doc(slice: &mut &[TokenTree]) {
-    match slice[0].kind {
-        TokenNode::Op('#', Spacing::Alone) => {}
+    match &slice[0] {
+        TokenTree::Op(tt) => {
+            assert_eq!(tt.op(), '#');
+            assert_eq!(tt.spacing(), Spacing::Alone);
+        }
         _ => panic!("expected #"),
     }
-    let inner = match slice[1].kind {
-        TokenNode::Group(Delimiter::Bracket, ref s) => s.clone(),
+    let inner = match &slice[1] {
+        TokenTree::Group(tt) => {
+            assert_eq!(tt.delimiter(), Delimiter::Bracket);
+            tt.stream()
+        }
         _ => panic!("expected brackets"),
     };
     let tokens = inner.into_iter().collect::<Vec<_>>();
@@ -79,49 +85,55 @@ fn assert_doc(slice: &mut &[TokenTree]) {
         panic!("expected three tokens in doc")
     }
 
-    match tokens[0].kind {
-        TokenNode::Term(ref t) => assert_eq!("doc", t.as_str()),
+    match &tokens[0] {
+        TokenTree::Term(tt) => assert_eq!("doc", tt.as_str()),
         _ => panic!("expected `doc`"),
     }
-    match tokens[1].kind {
-        TokenNode::Op('=', Spacing::Alone) => {}
+    match &tokens[1] {
+        TokenTree::Op(tt) => {
+            assert_eq!(tt.op(), '=');
+            assert_eq!(tt.spacing(), Spacing::Alone);
+        }
         _ => panic!("expected equals"),
     }
-    match tokens[2].kind {
-        TokenNode::Literal(_) => {}
+    match tokens[2] {
+        TokenTree::Literal(_) => {}
         _ => panic!("expected literal"),
     }
 
     *slice = &slice[2..];
 }
 
 fn assert_invoc(slice: &mut &[TokenTree]) {
-    match slice[0].kind {
-        TokenNode::Op('#', _) => {}
+    match &slice[0] {
+        TokenTree::Op(tt) => assert_eq!(tt.op(), '#'),
         _ => panic!("expected '#' char"),
     }
-    match slice[1].kind {
-        TokenNode::Group(Delimiter::Bracket, _) => {}
+    match &slice[1] {
+        TokenTree::Group(tt) => assert_eq!(tt.delimiter(), Delimiter::Bracket),
         _ => panic!("expected brackets"),
     }
     *slice = &slice[2..];
 }
 
 fn assert_foo(slice: &mut &[TokenTree]) {
-    match slice[0].kind {
-        TokenNode::Term(ref name) => assert_eq!(name.as_str(), "fn"),
+    match &slice[0] {
+        TokenTree::Term(tt) => assert_eq!(tt.as_str(), "fn"),
         _ => panic!("expected fn"),
     }
-    match slice[1].kind {
-        TokenNode::Term(ref name) => assert_eq!(name.as_str(), "foo"),
+    match &slice[1] {
+        TokenTree::Term(tt) => assert_eq!(tt.as_str(), "foo"),
         _ => panic!("expected foo"),
     }
-    match slice[2].kind {
-        TokenNode::Group(Delimiter::Parenthesis, ref s) => assert!(s.is_empty()),
+    match &slice[2] {
+        TokenTree::Group(tt) => {
+            assert_eq!(tt.delimiter(), Delimiter::Parenthesis);
+            assert!(tt.stream().is_empty());
+        }
         _ => panic!("expected parens"),
     }
-    match slice[3].kind {
-        TokenNode::Group(Delimiter::Brace, _) => {}
+    match &slice[3] {
+        TokenTree::Group(tt) => assert_eq!(tt.delimiter(), Delimiter::Brace),
         _ => panic!("expected braces"),
     }
     *slice = &slice[4..];
@@ -132,22 +144,17 @@ fn fold_stream(input: TokenStream) -> TokenStream {
 }
 
 fn fold_tree(input: TokenTree) -> TokenTree {
-    TokenTree {
-        span: input.span,
-        kind: fold_node(input.kind),
-    }
-}
-
-fn fold_node(input: TokenNode) -> TokenNode {
     match input {
-        TokenNode::Group(a, b) => TokenNode::Group(a, fold_stream(b)),
-        TokenNode::Op(a, b) => TokenNode::Op(a, b),
-        TokenNode::Term(a) => TokenNode::Term(a),
-        TokenNode::Literal(a) => {
+        TokenTree::Group(b) => {
+            TokenTree::Group(Group::new(b.delimiter(), fold_stream(b.stream())))
+        }
+        TokenTree::Op(b) => TokenTree::Op(b),
+        TokenTree::Term(a) => TokenTree::Term(a),
+        TokenTree::Literal(a) => {
             if a.to_string() != "\"foo\"" {
-                TokenNode::Literal(a)
+                TokenTree::Literal(a)
             } else {
-                TokenNode::Literal(Literal::integer(3))
+                TokenTree::Literal(Literal::i32_unsuffixed(3))
             }
         }
     }
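Note how `fold_tree` now has to rebuild the group through `Group::new` rather than mutate a public field, since the variant structs are opaque. The same recurse-and-rebuild pattern generalizes to any token rewrite; a sketch (the helper name `map_literals` is hypothetical):

    extern crate proc_macro;

    use proc_macro::{Group, Literal, TokenStream, TokenTree};

    // Recursively rewrite every literal in a stream, rebuilding each opaque
    // Group around its transformed contents.
    fn map_literals<F>(stream: TokenStream, f: &F) -> TokenStream
        where F: Fn(Literal) -> Literal
    {
        stream.into_iter().map(|tree| match tree {
            TokenTree::Group(g) => {
                TokenTree::Group(Group::new(g.delimiter(), map_literals(g.stream(), f)))
            }
            TokenTree::Literal(lit) => TokenTree::Literal(f(lit)),
            other => other,
        }).collect()
    }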
