mirror of
https://github.com/PyCQA/flake8.git
synced 2026-04-04 04:06:54 +00:00
Add support for tokens of a complete file
The `tokens` property of the `FileProcessor` class only contains the tokens of the current line, not all tokens of the file. For a plugin that is executed only once per file, that property is therefore useless. To make the tokens available to such plugins as well, it is now possible to supply all the tokens of a file. This also updates the documentation to separate the parameters that are set once per file from those that change on each line; using the latter in plugins that run only once per file isn't very sensible.
This commit is contained in:
parent
c3ee4829ed
commit
9cf8603e94
2 changed files with 27 additions and 4 deletions
@@ -34,18 +34,27 @@ a file, a plugin can ask for any of the following:

- :attr:`~flake8.processor.FileProcessor.indent_level`
- :attr:`~flake8.processor.FileProcessor.line_number`
- :attr:`~flake8.processor.FileProcessor.logical_line`
- :attr:`~flake8.processor.FileProcessor.max_line_length`
- :attr:`~flake8.processor.FileProcessor.multiline`
- :attr:`~flake8.processor.FileProcessor.noqa`
- :attr:`~flake8.processor.FileProcessor.previous_indent_level`
- :attr:`~flake8.processor.FileProcessor.previous_logical`
- :attr:`~flake8.processor.FileProcessor.tokens`

Some properties are set once per file for plugins which iterate themselves
over the data instead of being called on each physical or logical line.

- :attr:`~flake8.processor.FileProcessor.filename`
- :attr:`~flake8.processor.FileProcessor.file_tokens`
- :attr:`~flake8.processor.FileProcessor.lines`
- :attr:`~flake8.processor.FileProcessor.max_line_length`
- :attr:`~flake8.processor.FileProcessor.total_lines`
- :attr:`~flake8.processor.FileProcessor.verbose`

Alternatively, a plugin can accept ``tree`` and ``filename``.
``tree`` will be a parsed abstract syntax tree that will be used by plugins
like PyFlakes and McCabe.
These parameters can also be supplied to plugins working on each line
separately. Additionally, plugins called once per file can also accept
``tree``, which is not supplied as a parameter of
:class:`~flake8.processor.FileProcessor` and which will be a parsed abstract
syntax tree. It is used by plugins like PyFlakes and McCabe.

Registering Options
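A once-per-file plugin can then request ``file_tokens`` by parameter name. The following is a hedged sketch, assuming the convention that flake8 matches a class-based plugin's ``__init__`` parameter names against `FileProcessor` attributes; the class name and the `X100` error code are invented for illustration.

```python
import tokenize

# Hypothetical once-per-file plugin. Requesting `file_tokens` (rather
# than the per-line `tokens`) supplies the token stream of the whole
# file. The class name and the X100 code are invented for illustration.
class FileTokenChecker(object):
    name = 'example-file-tokens'
    version = '0.1.0'

    def __init__(self, file_tokens, filename):
        self.file_tokens = file_tokens
        self.filename = filename

    def run(self):
        # Flag TODO comments anywhere in the file -- something the
        # per-line `tokens` property cannot see all at once.
        for tok in self.file_tokens:
            if tok[0] == tokenize.COMMENT and 'TODO' in tok[1]:
                row, col = tok[2]
                yield row, col, 'X100 TODO comment found', type(self)
```

Each yielded tuple follows the usual flake8 report shape: row, column, message prefixed by an error code, and the reporting class.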
@@ -43,6 +43,7 @@ class FileProcessor(object):

    - :attr:`previous_indent_level`
    - :attr:`previous_logical`
    - :attr:`tokens`
    - :attr:`file_tokens`
    - :attr:`total_lines`
    - :attr:`verbose`
    """
@@ -101,6 +102,19 @@ class FileProcessor(object):

        self.statistics = {
            'logical lines': 0,
        }
        self._file_tokens = None

    @property
    def file_tokens(self):
        if self._file_tokens is None:
            line_iter = iter(self.lines)
            try:
                self._file_tokens = list(tokenize.generate_tokens(
                    lambda: next(line_iter)))
            except tokenize.TokenError as exc:
                raise exceptions.InvalidSyntax(exc.message, exception=exc)

        return self._file_tokens[:]

    @contextlib.contextmanager
    def inside_multiline(self, line_number):
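Outside of flake8, the lazy-tokenize-and-copy pattern in the new property can be sketched standalone (the `TokenCache` name is invented, and the wrapping of errors in flake8's `InvalidSyntax` is omitted):

```python
import tokenize

class TokenCache(object):
    """Minimal sketch of the file_tokens caching pattern in the diff."""

    def __init__(self, lines):
        self.lines = lines  # physical lines, each ending in '\n'
        self._file_tokens = None

    @property
    def file_tokens(self):
        # Tokenize lazily on first access; tokenize.generate_tokens
        # treats StopIteration from the readline callable as EOF.
        if self._file_tokens is None:
            line_iter = iter(self.lines)
            self._file_tokens = list(
                tokenize.generate_tokens(lambda: next(line_iter)))
        # Return a shallow copy so callers cannot mutate the cache.
        return self._file_tokens[:]
```

On a syntactically broken file (for example, an unterminated triple-quoted string) `generate_tokens` raises `tokenize.TokenError`, which the real code converts into flake8's `InvalidSyntax` exception.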