
Lines Matching defs:TOKEN

33 # Regular expression used to match valid token names
45 # Exception thrown when invalid token encountered and no default error
52 # Token class
67 # token() - Get the next token
247 # token() - Return the next token from the Lexer
253 def token(self):
274 # Create a token for return
287 # If no token type was set, it's an ignored token
291 # If func is not callable, it means it's an ignored token
295 # If token is processed by a function, call it
298 # Every function must return a token; if it returns nothing, we just move to the next token
303 # Verify type of the token. If not in the token map, raise an error
306 raise LexError("%s:%d: Rule '%s' returned an unknown token type '%s'" % (
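The matches above trace the core of a PLY-style `token()` loop: match the input against each rule's regex, build a token object, skip ignored tokens, call the rule's handler if one exists, and verify that whatever the handler returns is a declared token type. The following is a minimal sketch of that control flow, not the actual implementation; the `Tok` class, the `rules` mapping of names to `(compiled_regex, handler_or_None)` pairs, and the function signature are all hypothetical simplifications.

```python
import re

class LexError(Exception):
    pass

class Tok:
    """Hypothetical stand-in for PLY's LexToken."""
    def __init__(self, type_, value, lexpos):
        self.type, self.value, self.lexpos = type_, value, lexpos

def token(data, rules, tokens, pos=0):
    """Return (next_token, new_pos), or (None, pos) at end of input.

    `rules` maps rule names to (compiled regex, handler) pairs; a handler
    of None marks an ignored token. `tokens` is the set of valid names.
    """
    while pos < len(data):
        for name, (pattern, func) in rules.items():
            m = pattern.match(data, pos)
            if not m:
                continue
            # Create a token for return
            tok = Tok(name, m.group(), pos)
            pos = m.end()
            if func is None:
                break  # ignored token: skip it and rescan
            tok = func(tok)
            if tok is None:
                break  # handler returned nothing: move to the next token
            # Verify the type; if not in the token map, raise an error
            if tok.type not in tokens:
                raise LexError("Rule '%s' returned an unknown token type '%s'"
                               % (name, tok.type))
            return tok, pos
        else:
            raise LexError("Illegal character %r at position %d"
                           % (data[pos], pos))
    return None, pos
```

For example, with a `NUMBER` rule and an ignored whitespace rule, `token("  42", rules, {'NUMBER'})` skips the leading spaces and returns a `NUMBER` token for `"42"`.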
462 # is a tuple of state names and tokenname is the name of the token. For example,
496 global token,input
529 token = lexobj.token
552 # Build a dictionary of valid token names
557 print("lex: Bad token name '%s'" % n)
560 print("lex: Warning. Token '%s' multiply defined." % n)
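The matches at lines 552-560 show the lexer builder validating declared token names against the identifier regex mentioned at line 33, warning about duplicates as it builds the map. A rough sketch of that check, assuming the conventional `[a-zA-Z0-9_]+` identifier pattern (the function name and return shape here are illustrative, not from the source):

```python
import re

# Regular expression used to match valid token names (assumed form)
_is_identifier = re.compile(r'^[a-zA-Z0-9_]+$')

def build_token_map(token_list):
    """Build a dictionary of valid token names, warning on bad
    or multiply defined names, as the lexer builder does."""
    names = {}
    for n in token_list:
        if not _is_identifier.match(n):
            print("lex: Bad token name '%s'" % n)
            continue
        if n in names:
            print("lex: Warning. Token '%s' multiply defined." % n)
        names[n] = None
    return names
```

Invalid names are dropped entirely, while duplicates produce only a warning and keep a single entry.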
617 toknames = { } # Mapping of symbols to token names
730 print("lex: Rule '%s' defined for an unspecified token %s." % (name,tokname))
813 # Create global versions of the token() and input() functions
814 token = lexobj.token
847 _token = lexer.token
849 _token = token
858 # @TOKEN(regex)
864 def TOKEN(r):
870 # Alternative spelling of the TOKEN decorator
871 Token = TOKEN
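The `@TOKEN(regex)` decorator exists because the lexer reads each rule function's docstring as its regular expression; the decorator lets you attach a regex (e.g. one built from named sub-patterns) without writing it literally in the docstring. A minimal sketch of how such a decorator can work, with `t_NUMBER` as a hypothetical rule:

```python
def TOKEN(r):
    """Attach the regex string `r` to a rule function as its docstring,
    where the lexer builder will later find it."""
    def set_doc(f):
        f.__doc__ = r
        return f
    return set_doc

# Alternative spelling of the TOKEN decorator
Token = TOKEN

@TOKEN(r'\d+')
def t_NUMBER(t):
    t.value = int(t.value)
    return t
```

After decoration, `t_NUMBER.__doc__` is `r'\d+'`, exactly as if the regex had been written as the function's docstring by hand.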