Antlr - Token

1 - About

A token is the smallest meaningful unit that the lexer produces from the input character stream and that the parser consumes. In ANTLR, each token has a name and an integer token type.

2 - Definition

A token can be defined via:

2.1 - Lexer rule

A token is primarily defined via a lexer rule (lexical rule).

Example:


LOWERCASE : [a-z]+ ;
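
As an illustration, a minimal complete grammar built around this rule could look as follows (the grammar name Lower, the parser rule word, and the WS rule are assumptions added here, not part of the original example):

// Hypothetical minimal grammar showing a token defined by a lexer rule
grammar Lower;

// Parser rule: a sequence of lowercase words up to end of input
word : LOWERCASE+ EOF ;

// Lexer rule: defines the LOWERCASE token
LOWERCASE : [a-z]+ ;

// Skip whitespace between tokens
WS : [ \t\r\n]+ -> skip ;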

2.2 - Token Definition Section

The token definition section of the grammar is for tokens with no associated lexical rule. The tokens section defines a set of token types to add to the overall set.

The basic syntax is:


tokens { Token1, ..., TokenN }

Usage: Most of the time, the tokens section is used to define token types needed by actions in the grammar.

Example: explicitly define keyword token types to avoid implicit definition warnings


tokens { BEGIN, END, IF, THEN, WHILE }

@lexer::members {
  // keywords map used in the lexer to assign token types
  Map<String,Integer> keywords = new HashMap<String,Integer>() {{
    put("begin", KeywordsParser.BEGIN);
    put("end",   KeywordsParser.END);
    ...
  }};
}
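
For context, here is a hedged sketch of how such a map is typically consumed: a generic identifier lexer rule checks the matched text against the map and reassigns the token type with setType(). The ID rule below is an assumption for illustration, not part of the original grammar:

// Hypothetical identifier rule that consults the keywords map
ID : [a-zA-Z]+
     {
       // If the matched text is a keyword, replace the generic ID
       // token type with the keyword's token type from the map
       Integer type = keywords.get(getText());
       if (type != null) setType(type);
     }
   ;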

3 - Tokens File

Tokens have their own file after generation:


grammar Tok;
tokens { A, B, C }
a : X ;
Generate the lexer/parser:

antlr4 Tok.g4


warning(125): Tok.g4:3:4: implicit definition of token X in parser

A tokens file was created:

cat Tok.tokens


A=1
B=2
C=3
X=4
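
A common use of the generated .tokens file is to share these token types with another grammar via the tokenVocab option; below is a minimal sketch, assuming a separate parser grammar named TokUser (that name and the rule list are not part of the original example):

// Hypothetical parser grammar reusing the token types from Tok.tokens
parser grammar TokUser;

options { tokenVocab=Tok; }  // load token type numbers from Tok.tokens

// A, B, and C resolve to the same integer types (1, 2, 3) as in Tok
list : A B C ;

This keeps the numeric token types consistent across the two grammars, so tokens produced against one vocabulary can be consumed by the other.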
