
Commit 66c1fcd

hiker, lukehoffmann and jasonjunweilyu authored

Update to fparser 0.2 (#373)
* Support new and old style of PSyclone command line (no more nemo api etc)
* Fix mypy errors.
* Added missing tests for calling psyclone, and converting old style to new style arguments and vice versa.
* Updated comment.
* Removed mixin, use a simple regex instead.
* Added support for ifx/icx compiler as intel-llvm class.
* Added support for nvidia compiler.
* Add preliminary support for Cray compiler.
* Added Cray compiler wrapper ftn and cc.
* Follow a more consistent naming scheme for crays, even though the native compiler names are longer (crayftn-cray, craycc-cray).
* Changed names again.
* Renamed cray compiler wrapper to be CrayCcWrapper and CrayFtnWrapper, to avoid confusion with Craycc.
* Fixed incorrect name in comments.
* Additional compilers (#349)
* Moved OBJECT_ARCHIVES from constants to ArtefactSet.
* Moved PRAGMAD_C from constants to ArtefactSet.
* Turned 'all_source' into an enum.
* Allow integer as revision.
* Fixed flake8 error.
* Removed specific functions to add/get fortran source files etc.
* Removed non-existing and unnecessary collections.
* Try to fix all run_configs.
* Fixed rebase issues.
* Added replace functionality to ArtefactStore, updated test_artefacts to cover all lines in that file.
* Started to replace artefacts when files are pre-processed.
* Removed linker argument from linking step in all examples.
* Try to get jules to link.
* Fixed build_jules.
* Fixed other issues raised in reviews.
* Try to get jules to link.
* Fixed other issues raised in reviews.
* Simplify handling of X90 files by replacing the X90 with x90, meaning only one artefact set is involved when running PSyclone.
* Make OBJECT_ARCHIVES also a dict, migrate more code to replace/add files to the default build artefact collections.
* Fixed some examples.
* Fix flake8 error.
* Fixed failing tests.
* Support empty comments.
* Fix preprocessor to not unnecessarily remove and add files that are already in the output directory.
* Allow find_source_files to be called more than once by adding files (not replacing artefact).
* Updated lfric_common so that files created by configurator are written in build (not source).
* Use c_build_files instead of pragmad_c.
* Removed unnecessary str.
* Documented the new artefact set handling.
* Fixed typo.
* Make the PSyclone API configurable.
* Fixed formatting of documentation, properly used ArtefactSet names.
* Support .f and .F Fortran files.
* Removed setter for tool.is_available, which was only used for testing.
* #3 Fix documentation and coding style issues from review.
* Renamed Categories into Category.
* Minor coding style cleanup.
* Removed more unnecessary ().
* Re-added (invalid) grab_pre_build call.
* Fixed typo.
* Renamed set_default_vendor to set_default_compiler_suite.
* Renamed VendorTool to CompilerSuiteTool.
* Also accept a Path as exec_name specification for a tool.
* Move the check_available function into the base class.
* Fixed some types and documentation.
* Fix typing error.
* Added explanation for meta-compiler.
* Improved error handling and documentation.
* Replace mpiifort with mpifort to be a tiny bit more portable.
* Use classes to group tests for git/svn/fcm together.
* Fixed issue in get_transformation script, and moved script into lfric_common to remove code duplication.
* Code improvement as suggested by review.
* Fixed run config
* Added reference to ticket.
* Updated type information.
* More typing fixes.
* Fixed typing warnings.
* As requested by reviewer removed is_working_copy functionality.
* Issue a warning (which can be silenced) when a tool in a toolbox is replaced.
* Fixed flake8.
* Fixed flake8.
* Fixed failing test.
* Addressed issues raised in review.
* Removed now unnecessary operations.
* Updated some type information.
* Fixed all references to APIs to be consistent with PSyclone 2.5.
* Added api to the checksum computation.
* Fixed type information.
* Added test to verify that changing the api changes the checksum.
* Make compiler version a tuple of integers
* Update some tests to use tuple versions
* Explicitly test handling of bad version format
* Fix formatting
* Tidying up
* Make compiler raise an error for any invalid version string. Assume these compilers don't need to be hashed. Saves dealing with empty tuples.
* Check compiler version string for compiler name
* Fix formatting
* Add compiler.get_version_string() method. Includes other cleanup from PR comments
* Add mpi and openmp settings to BuildConfig, made compiler MPI aware.
* Looks like the circular dependency has been fixed.
* Revert "Looks like the circular dependency has been fixed." ... while it works with the tests, a real application still triggered it. This reverts commit 150dc37.
* Don't even try to find a C compiler if no C files are to be compiled.
* Updated gitignore to ignore (recently renamed) documentation.
* Fixed failing test.
* Return from compile Fortran early if there are no files to compile. Fixed coding style.
* Add MPI-enabled wrappers for intel and gnu compiler.
* Fixed test.
* Automatically add openmp flag to compiler and linker based on BuildConfig.
* Removed enforcement of keyword parameters, which is not supported in Python 3.7.
* Fixed failing test.
* Support more than one tool of a given suite by sorting them.
* Use different version check for each compiler vendor with mixins
* Refactoring, remove unittest compiler class
* Fix some mypy errors
* Use 'Union' type hint to fix build checks
* Added option to add flags to a tool.
* Introduce proper compiler wrapper, used this to implement a properly wrapped MPI compiler.
* Fixed typo in types.
* Return run_version_command to base Compiler class. Provides default version command that can be overridden for other compilers. Also fix some incorrect tests. Other tidying
* Add a missing type hint
* Added (somewhat stupid) 'test' to reach 100% coverage of PSyclone tool.
* Simplified MPI support in wrapper.
* More compiler wrapper coverage.
* Removed duplicated function.
* Removed debug print.
* Removed permanently changing compiler attributes, which can cause test failures later.
* More tests for C compiler wrapper.
* More work on compiler wrapper tests.
* Fixed version and availability handling, added missing tests for 100% coverage.
* Fixed typing error.
* Try to fix python 3.7.
* Tried to fix failing tests.
* Remove inheritance from mixins and use protocol
* Simplify compiler inheritance. Mixins have static methods with unique names, overrides only happen in concrete classes
* Updated wrapper and tests to handle error raised in get_version.
* Simplified regular expressions (now tests cover detection of version numbers with only a major version).
* Test for missing mixin.
* Use the parsing mixin from the compiler in a compiler wrapper.
* Use setattr instead of assignment to make mypy happy.
* Simplify usage of compiler-specific parsing mixins.
* Minor code cleanup.
* Updated documentation.
* Simplify usage of compiler-specific parsing mixins.
* Test for missing mixin.
* Fixed test.
* Added missing openmp_flag property to compiler wrapper.
* Don't use isinstance for consistency check, which does not work for CompilerWrappers.
* Fixed isinstance test for C compilation which doesn't work with a CompilerWrapper.
* Use a linker's compiler to determine MPI support. Removed mpi property from CompilerSuite.
* Added more tests for invalid version numbers.
* Added more test cases for invalid version number, improved regex to work as expected.
* Fixed typo in test.
* Fixed flake/mypy errors.
* Combine wrapper flags with flags from wrapped compiler.
* Made mypy happy.
* Fixed test.
* Split tests into smaller individual ones, fixed missing assert in test.
* Parameterised compiler version tests to also test wrapper.
* Added missing MPI parameter when getting the compiler.
* Fixed comments.
* Order parameters to be in same order for various compiler classes.
* Remove stray character
* Added getter for wrapped compiler.
* Fixed small error that would prevent nested compiler wrappers from being used.
* Added a cast to make mypy happy.
* Add simple getter for linker library flags
* Add getter for linker flags by library
* Fix formatting
* Add optional libs argument to link function
* Reorder and clean up linker tests
* Make sure `Linker.link()` raises for unknown lib
* Add missing type
* Fix typing error
* Add 'libs' argument to link_exe function
* Try to add documentation for the linker libs feature
* Use correct list type in link_exe hint
* Add silent replace option to linker.add_lib_flags
* Fixed spelling mistake in option.
* Clarified documentation.
* Removed unnecessary functions in CompilerWrapper.
* Fixed failing test triggered by executing them in specific order (tools then steps)
* Fixed line lengths.
* Add tests for linker LDFLAG
* Add pre- and post- lib flags to link function
* Fix syntax in built-in lib flags
* Remove netcdf as a built-in linker library. Bash-style substitution is not currently handled
* Configure pre- and post-lib flags on the Linker object. Previously they were passed into the Linker.link() function
* Use more realistic linker lib flags
* Formatting fix
* Removed mixin, use a simple regex instead.
* Added support for ifx/icx compiler as intel-llvm class.
* Added support for nvidia compiler.
* Add preliminary support for Cray compiler.
* Added Cray compiler wrapper ftn and cc.
* Made mpi and openmp default to False in the BuildConfig constructor.
* Removed white space.
* Follow a more consistent naming scheme for crays, even though the native compiler names are longer (crayftn-cray, craycc-cray).
* Changed names again.
* Support compilers that do not support OpenMP.
* Added documentation for openmp parameter.
* Renamed cray compiler wrapper to be CrayCcWrapper and CrayFtnWrapper, to avoid confusion with Craycc.
* Fixed incorrect name in comments.

---------

Co-authored-by: jasonjunweilyu <[email protected]>
Co-authored-by: Luke Hoffmann <[email protected]>
Co-authored-by: Luke Hoffmann <[email protected]>

* Support new and old style of PSyclone command line (no more nemo api etc)
* Fix mypy errors.
* Added missing tests for calling psyclone, and converting old style to new style arguments and vice versa.
* Added shell tool.
* Try to make mypy happy.
* Removed debug code.
* ToolRepository now only returns defaults that are available. Updated tests to mark tools as available.
* Fixed typos and coding style.
* Support new and old style of PSyclone command line (no more nemo api etc)
* Fix mypy errors.
* Added missing tests for calling psyclone, and converting old style to new style arguments and vice versa.
* Updated comment.
* Fixed failing tests.
* Updated fparser dependency to version 0.2.
* Replace old code for handling sentinels with triggering this behaviour in fparser. Require config in constructor of Analyser classes.
* Fixed tests for latest changes.
* Removed invalid openmp continuation line - since now fparser fails when trying to parse this line.
* Added test for disabled openmp parsing. Updated test to work with new test file.
* Coding style changes.
* Fix flake issues.
* Fixed double _.
* Removed more accesses to private members.
* Added missing type hint.
* Make flake8 happy.

---------

Co-authored-by: Luke Hoffmann <[email protected]>
Co-authored-by: jasonjunweilyu <[email protected]>
Co-authored-by: Luke Hoffmann <[email protected]>
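Several of the bullets above concern making the compiler version a tuple of integers and raising an error for invalid version strings. A minimal, stdlib-only sketch of that idea (this is not Fab's actual `Compiler` code; the function name and regex are illustrative assumptions):

```python
import re
from typing import Tuple


def parse_version(output: str) -> Tuple[int, ...]:
    """Extract a dotted version number from compiler output as an int tuple.

    Raises RuntimeError if no version can be found, rather than silently
    returning an empty tuple.
    """
    match = re.search(r"\b(\d+(?:\.\d+)+)\b", output)
    if match is None:
        raise RuntimeError(f"Unexpected version output: {output!r}")
    return tuple(int(part) for part in match.group(1).split("."))
```

Returning a tuple (rather than a string) makes version comparisons like `version >= (10, 2)` work naturally.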
1 parent 6a2b40f commit 66c1fcd

File tree

12 files changed

+140
-110
lines changed


pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ authors = [
 license = {file = 'LICENSE.txt'}
 dynamic = ['version', 'readme']
 requires-python = '>=3.7, <4'
-dependencies = ['fparser']
+dependencies = ['fparser >= 0.2']
 classifiers = [
     'Development Status :: 1 - Planning',
     'Environment :: Console',

source/fab/parse/c.py

Lines changed: 24 additions & 18 deletions
@@ -11,44 +11,48 @@
 from pathlib import Path
 from typing import List, Optional, Union, Tuple

-from fab.dep_tree import AnalysedDependent
-
 try:
     import clang  # type: ignore
     import clang.cindex  # type: ignore
 except ImportError:
     clang = None

+from fab.build_config import BuildConfig
+from fab.dep_tree import AnalysedDependent
 from fab.util import log_or_dot, file_checksum

 logger = logging.getLogger(__name__)


 class AnalysedC(AnalysedDependent):
     """
-    An analysis result for a single C file, containing symbol definitions and dependencies.
+    An analysis result for a single C file, containing symbol definitions and
+    dependencies.

-    Note: We don't need to worry about compile order with pure C projects; we can compile all in one go.
-    However, with a *Fortran -> C -> Fortran* dependency chain, we do need to ensure that one Fortran file
-    is compiled before another, so this class must be part of the dependency tree analysis.
+    Note: We don't need to worry about compile order with pure C projects; we
+    can compile all in one go. However, with a *Fortran -> C -> Fortran*
+    dependency chain, we do need to ensure that one Fortran file is
+    compiled before another, so this class must be part of the
+    dependency tree analysis.

     """
-    # Note: This subclass adds nothing to it's parent, which provides everything it needs.
-    # We'd normally remove an irrelevant class like this but we want to keep the door open
-    # for filtering analysis results by type, rather than suffix.
-    pass
+    # Note: This subclass adds nothing to it's parent, which provides
+    # everything it needs. We'd normally remove an irrelevant class
+    # like this but we want to keep the door open for filtering
+    # analysis results by type, rather than suffix.


-class CAnalyser(object):
+class CAnalyser:
     """
     Identify symbol definitions and dependencies in a C file.

     """

-    def __init__(self):
+    def __init__(self, config: BuildConfig):

         # runtime
-        self._config = None
+        self._config = config
+        self._include_region: List[Tuple[int, str]] = []

     # todo: simplifiy by passing in the file path instead of the analysed tokens?
     def _locate_include_regions(self, trans_unit) -> None:

@@ -100,8 +104,7 @@ def _check_for_include(self, lineno) -> Optional[str]:
         include_stack.pop()
         if include_stack:
             return include_stack[-1]
-        else:
-            return None
+        return None

     def run(self, fpath: Path) \
             -> Union[Tuple[AnalysedC, Path], Tuple[Exception, None]]:

@@ -149,9 +152,11 @@ def run(self, fpath: Path) \
                 continue
             logger.debug('Considering node: %s', node.spelling)

-            if node.kind in {clang.cindex.CursorKind.FUNCTION_DECL, clang.cindex.CursorKind.VAR_DECL}:
+            if node.kind in {clang.cindex.CursorKind.FUNCTION_DECL,
+                             clang.cindex.CursorKind.VAR_DECL}:
                 self._process_symbol_declaration(analysed_file, node, usr_symbols)
-            elif node.kind in {clang.cindex.CursorKind.CALL_EXPR, clang.cindex.CursorKind.DECL_REF_EXPR}:
+            elif node.kind in {clang.cindex.CursorKind.CALL_EXPR,
+                               clang.cindex.CursorKind.DECL_REF_EXPR}:
                 self._process_symbol_dependency(analysed_file, node, usr_symbols)
         except Exception as err:
             logger.exception(f'error walking parsed nodes {fpath}')

@@ -166,7 +171,8 @@ def _process_symbol_declaration(self, analysed_file, node, usr_symbols):
         if node.is_definition():
             # only global symbols can be used by other files, not static symbols
             if node.linkage == clang.cindex.LinkageKind.EXTERNAL:
-                # This should catch function definitions which are exposed to the rest of the application
+                # This should catch function definitions which are exposed to
+                # the rest of the application
                 logger.debug(' * Is defined in this file')
                 # todo: ignore if inside user pragmas?
                 analysed_file.add_symbol_def(node.spelling)
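The reflowed conditionals above classify clang AST nodes by testing their cursor kind against two small sets. A reduced, self-contained sketch of that dispatch pattern (using a stand-in enum, since `clang.cindex` may not be installed; the names `DECLARATION_KINDS`, `DEPENDENCY_KINDS` and `classify` are invented here):

```python
from enum import Enum, auto


class CursorKind(Enum):
    # Stand-in for the relevant members of clang.cindex.CursorKind.
    FUNCTION_DECL = auto()
    VAR_DECL = auto()
    CALL_EXPR = auto()
    DECL_REF_EXPR = auto()


DECLARATION_KINDS = {CursorKind.FUNCTION_DECL, CursorKind.VAR_DECL}
DEPENDENCY_KINDS = {CursorKind.CALL_EXPR, CursorKind.DECL_REF_EXPR}


def classify(kind: CursorKind) -> str:
    """Mirror the analyser's two set-membership checks over node kinds."""
    if kind in DECLARATION_KINDS:
        return "declaration"
    if kind in DEPENDENCY_KINDS:
        return "dependency"
    return "ignored"
```

Set membership keeps each check O(1) and lets new cursor kinds be added without growing the `if` chain.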

source/fab/parse/fortran.py

Lines changed: 10 additions & 31 deletions
@@ -11,7 +11,6 @@
 from pathlib import Path
 from typing import Union, Optional, Iterable, Dict, Any, Set

-from fparser.common.readfortran import FortranStringReader  # type: ignore
 from fparser.two.Fortran2003 import (  # type: ignore
     Entity_Decl_List, Use_Stmt, Module_Stmt, Program_Stmt, Subroutine_Stmt, Function_Stmt, Language_Binding_Spec,
     Char_Literal_Constant, Interface_Block, Name, Comment, Module, Call_Stmt, Derived_Type_Def, Derived_Type_Stmt,

@@ -21,6 +20,7 @@
 from fparser.two.Fortran2008 import (  # type: ignore
     Type_Declaration_Stmt, Attr_Spec_List)

+from fab.build_config import BuildConfig
 from fab.dep_tree import AnalysedDependent
 from fab.parse.fortran_common import iter_content, _has_ancestor_type, _typed_child, FortranAnalyserBase
 from fab.util import file_checksum, string_checksum

@@ -167,15 +167,21 @@ class FortranAnalyser(FortranAnalyserBase):
     A build step which analyses a fortran file using fparser2, creating an :class:`~fab.dep_tree.AnalysedFortran`.

     """
-    def __init__(self, std=None, ignore_mod_deps: Optional[Iterable[str]] = None):
+    def __init__(self,
+                 config: BuildConfig,
+                 std: Optional[str] = None,
+                 ignore_mod_deps: Optional[Iterable[str]] = None):
         """
+        :param config: The BuildConfig to use.
         :param std:
             The Fortran standard.
         :param ignore_mod_deps:
             Module names to ignore in use statements.

         """
-        super().__init__(result_class=AnalysedFortran, std=std)
+        super().__init__(config=config,
+                         result_class=AnalysedFortran,
+                         std=std)
         self.ignore_mod_deps: Iterable[str] = list(ignore_mod_deps or [])
         self.depends_on_comment_found = False

@@ -295,33 +301,6 @@ def _process_comment(self, analysed_file, obj):
             # without .o means a fortran symbol
             else:
                 analysed_file.add_symbol_dep(dep)
-        if comment[:2] == "!$":
-            # Check if it is a use statement with an OpenMP sentinel:
-            # Use fparser's string reader to discard potential comment
-            # TODO #327: once fparser supports reading the sentinels,
-            # this can be removed.
-            # fparser issue: https://github.com/stfc/fparser/issues/443
-            reader = FortranStringReader(comment[2:])
-            try:
-                line = reader.next()
-            except StopIteration:
-                # No other item, ignore
-                return
-            try:
-                # match returns a 5-tuple, the third one being the module name
-                module_name = Use_Stmt.match(line.strline)[2]
-                module_name = module_name.string
-            except Exception:
-                # Not a use statement in a sentinel, ignore:
-                return
-
-            # Register the module name
-            if module_name in self.ignore_mod_deps:
-                logger.debug(f"ignoring use of {module_name}")
-                return
-            if module_name.lower() not in self._intrinsic_modules:
-                # found a dependency on fortran
-                analysed_file.add_module_dep(module_name)

     def _process_subroutine_or_function(self, analysed_file, fpath, obj):
         # binding?

@@ -353,7 +332,7 @@ def _process_subroutine_or_function(self, analysed_file, fpath, obj):
             analysed_file.add_symbol_def(name.string)


-class FortranParserWorkaround(object):
+class FortranParserWorkaround():
     """
     Use this class to create a workaround when the third-party Fortran parser is unable to process a valid source file.
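The surviving branch above records `! depends on:` comment dependencies, where a name ending in `.o` marks an object file and anything else is treated as a Fortran symbol. A hypothetical stand-alone sketch of that convention (the helper name `parse_depends_on` is invented here, not Fab's API):

```python
from typing import Optional, Tuple


def parse_depends_on(comment: str) -> Optional[Tuple[str, str]]:
    """Sketch of the `! depends on:` comment convention.

    Returns ("object", name) for a `.o` dependency, ("symbol", name)
    otherwise, or None if the comment carries no dependency marker.
    """
    marker = "depends on:"
    lowered = comment.lower()
    if marker not in lowered:
        return None
    dep = comment[lowered.index(marker) + len(marker):].strip()
    kind = "object" if dep.endswith(".o") else "symbol"
    return kind, dep
```

The OpenMP-sentinel handling that used to follow this branch is gone: with fparser 0.2, `!$` conditional lines are parsed by the reader itself when OpenMP is enabled.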
source/fab/parse/fortran_common.py

Lines changed: 51 additions & 25 deletions
@@ -10,13 +10,14 @@
 import logging
 from abc import ABC, abstractmethod
 from pathlib import Path
-from typing import Union, Tuple, Type
+from typing import Optional, Tuple, Type, Union

 from fparser.common.readfortran import FortranFileReader  # type: ignore
 from fparser.two.parser import ParserFactory  # type: ignore
 from fparser.two.utils import FortranSyntaxError  # type: ignore

 from fab import FabException
+from fab.build_config import BuildConfig
 from fab.dep_tree import AnalysedDependent
 from fab.parse import EmptySourceFile
 from fab.util import log_or_dot, file_checksum

@@ -58,49 +59,61 @@ def _typed_child(parent, child_type: Type, must_exist=False):
     # Look for a child of a certain type.
     # Returns the child or None.
     # Raises ValueError if more than one child of the given type is found.
-    children = list(filter(lambda child: isinstance(child, child_type), parent.children))
+    children = list(filter(lambda child: isinstance(child, child_type),
+                           parent.children))
     if len(children) > 1:
         raise ValueError(f"too many children found of type {child_type}")

     if children:
         return children[0]

     if must_exist:
-        raise FabException(f'Could not find child of type {child_type} in {parent}')
+        raise FabException(f'Could not find child of type {child_type} '
+                           f'in {parent}')
     return None


 class FortranAnalyserBase(ABC):
     """
-    Base class for Fortran parse-tree analysers, e.g FortranAnalyser and X90Analyser.
+    Base class for Fortran parse-tree analysers, e.g FortranAnalyser and
+    X90Analyser.

     """
     _intrinsic_modules = ['iso_fortran_env', 'iso_c_binding']

-    def __init__(self, result_class, std=None):
+    def __init__(self, config: BuildConfig,
+                 result_class,
+                 std: Optional[str] = None):
         """
+        :param config: The BuildConfig object.
         :param result_class:
             The type (class) of the analysis result. Defined by the subclass.
         :param std:
             The Fortran standard.

         """
+        self._config = config
         self.result_class = result_class
         self.f2008_parser = ParserFactory().create(std=std or "f2008")

-        # todo: this, and perhaps other runtime variables like it, might be better set at construction
-        # if we construct these objects at runtime instead...
-        # runtime, for child processes to read
-        self._config = None
+    @property
+    def config(self) -> BuildConfig:
+        '''Returns the BuildConfig to use.
+        '''
+        return self._config

     def run(self, fpath: Path) \
-            -> Union[Tuple[AnalysedDependent, Path], Tuple[EmptySourceFile, None], Tuple[Exception, None]]:
+            -> Union[Tuple[AnalysedDependent, Path],
+                     Tuple[EmptySourceFile, None],
+                     Tuple[Exception, None]]:
         """
-        Parse the source file and record what we're interested in (subclass specific).
+        Parse the source file and record what we're interested in (subclass
+        specific).

         Reloads previous analysis results if available.

-        Returns the analysis data and the result file where it was stored/loaded.
+        Returns the analysis data and the result file where it was
+        stored/loaded.

         """
         # calculate the prebuild filename

@@ -114,9 +127,11 @@ def run(self, fpath: Path) \
         # Load the result file into whatever result class we use.
         loaded_result = self.result_class.load(analysis_fpath)
         if loaded_result:
-            # This result might have been created by another user; their prebuild folder copied to ours.
-            # If so, the fpath in the result will *not* point to the file we eventually want to compile,
-            # it will point to the user's original file, somewhere else. So replace it with our own path.
+            # This result might have been created by another user; their
+            # prebuild folder copied to ours. If so, the fpath in the
+            # result will *not* point to the file we eventually want to
+            # compile, it will point to the user's original file,
+            # somewhere else. So replace it with our own path.
             loaded_result.fpath = fpath
             return loaded_result, analysis_fpath

@@ -125,43 +140,54 @@ def run(self, fpath: Path) \
         # parse the file, get a node tree
         node_tree = self._parse_file(fpath=fpath)
         if isinstance(node_tree, Exception):
-            return Exception(f"error parsing file '{fpath}':\n{node_tree}"), None
+            return (Exception(f"error parsing file '{fpath}':\n{node_tree}"),
+                    None)
         if node_tree.content[0] is None:
             logger.debug(f"  empty tree found when parsing {fpath}")
-            # todo: If we don't save the empty result we'll keep analysing it every time!
+            # todo: If we don't save the empty result we'll keep analysing
+            # it every time!
             return EmptySourceFile(fpath), None

         # find things in the node tree
-        analysed_file = self.walk_nodes(fpath=fpath, file_hash=file_hash, node_tree=node_tree)
+        analysed_file = self.walk_nodes(fpath=fpath, file_hash=file_hash,
                                        node_tree=node_tree)
         analysed_file.save(analysis_fpath)

         return analysed_file, analysis_fpath

     def _get_analysis_fpath(self, fpath, file_hash) -> Path:
-        return Path(self._config.prebuild_folder / f'{fpath.stem}.{file_hash}.an')
+        return Path(self.config.prebuild_folder /
+                    f'{fpath.stem}.{file_hash}.an')

     def _parse_file(self, fpath):
         """Get a node tree from a fortran file."""
-        reader = FortranFileReader(str(fpath), ignore_comments=False)
-        reader.exit_on_error = False  # don't call sys.exit, it messes up the multi-processing
+        reader = FortranFileReader(
+            str(fpath),
+            ignore_comments=False,
+            include_omp_conditional_lines=self.config.openmp)
+        # don't call sys.exit, it messes up the multi-processing
+        reader.exit_on_error = False

         try:
             tree = self.f2008_parser(reader)
             return tree
         except FortranSyntaxError as err:
-            # we can't return the FortranSyntaxError, it breaks multiprocessing!
+            # Don't return the FortranSyntaxError, it breaks multiprocessing!
             logger.error(f"\nfparser raised a syntax error in {fpath}\n{err}")
             return Exception(f"syntax error in {fpath}\n{err}")
         except Exception as err:
             logger.error(f"\nunhandled error '{type(err)}' in {fpath}\n{err}")
-            return Exception(f"unhandled error '{type(err)}' in {fpath}\n{err}")
+            return Exception(f"unhandled error '{type(err)}' in "
+                             f"{fpath}\n{err}")

     @abstractmethod
     def walk_nodes(self, fpath, file_hash, node_tree) -> AnalysedDependent:
         """
-        Examine the nodes in the parse tree, recording things we're interested in.
+        Examine the nodes in the parse tree, recording things we're
+        interested in.

-        Return type depends on our subclass, and will be a subclass of AnalysedDependent.
+        Return type depends on our subclass, and will be a subclass of
+        AnalysedDependent.

         """
         raise NotImplementedError
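With the config now injected at construction, `_get_analysis_fpath` can rely on `self.config.prebuild_folder` always being set. A tiny stand-alone sketch of the prebuild naming scheme shown in the diff, `<source stem>.<content hash>.an` (function signature simplified here; not Fab's actual method):

```python
from pathlib import Path


def get_analysis_fpath(prebuild_folder: Path, fpath: Path,
                       file_hash: int) -> Path:
    # One cached analysis result per (source file, content hash) pair:
    # changing the file changes the hash, which changes the cache name.
    return prebuild_folder / f"{fpath.stem}.{file_hash}.an"
```

Embedding the hash in the filename means a stale prebuild entry is simply never looked up again, rather than needing explicit invalidation.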

source/fab/parse/x90.py

Lines changed: 3 additions & 2 deletions
@@ -9,6 +9,7 @@
 from fparser.two.Fortran2003 import Use_Stmt, Call_Stmt, Name, Only_List, Actual_Arg_Spec_List, Part_Ref  # type: ignore

 from fab.parse import AnalysedFile
+from fab.build_config import BuildConfig
 from fab.parse.fortran_common import FortranAnalyserBase, iter_content, logger, _typed_child
 from fab.util import by_type

@@ -64,8 +65,8 @@ class X90Analyser(FortranAnalyserBase):
     # Makes a parsable fortran version of x90.
     # todo: Use hashing to reuse previous analysis results.

-    def __init__(self):
-        super().__init__(result_class=AnalysedX90)
+    def __init__(self, config: BuildConfig):
+        super().__init__(config=config, result_class=AnalysedX90)

     def walk_nodes(self, fpath, file_hash, node_tree) -> AnalysedX90:  # type: ignore
source/fab/steps/analyse.py

Lines changed: 4 additions & 6 deletions
@@ -130,8 +130,10 @@ def analyse(
     unreferenced_deps = list(unreferenced_deps or [])

     # todo: these seem more like functions
-    fortran_analyser = FortranAnalyser(std=std, ignore_mod_deps=ignore_mod_deps)
-    c_analyser = CAnalyser()
+    fortran_analyser = FortranAnalyser(config=config,
+                                       std=std,
+                                       ignore_mod_deps=ignore_mod_deps)
+    c_analyser = CAnalyser(config=config)

     # Creates the *build_trees* artefact from the files in `self.source_getter`.

@@ -144,10 +146,6 @@ def analyse(
     # - At this point we have a source tree for the entire source.
     # - (Optionally) Extract a sub tree for every root symbol, if provided. For building executables.

-    # todo: code smell - refactor (in another PR to keep things small)
-    fortran_analyser._config = config
-    c_analyser._config = config
-
     # parse
     files: List[Path] = source_getter(config.artefact_store)
     analysed_files = _parse_files(config, files=files, fortran_analyser=fortran_analyser, c_analyser=c_analyser)
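The change above replaces post-hoc patching of a private `_config` attribute with constructor injection. A reduced stand-in (class names invented here, not Fab's real classes) showing the pattern the analyse step now follows:

```python
class BuildConfigStub:
    """Stand-in for fab.build_config.BuildConfig."""
    def __init__(self, openmp: bool = False, mpi: bool = False):
        self.openmp = openmp
        self.mpi = mpi


class Analyser:
    """Stand-in analyser: the config is set once, at construction."""
    def __init__(self, config):
        self._config = config  # no later `analyser._config = ...` patching

    @property
    def config(self):
        return self._config
```

Injecting the config up front means every method (including those run in child processes) can assume `self.config` is valid, which is what lets the reader in `_parse_file` consult `self.config.openmp` safely.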
