3 changes: 3 additions & 0 deletions changelog
@@ -1,3 +1,6 @@
46) PR #3319 for #3318. Alters the logging set-up performed by PSyclone
so that it does not affect the root Python logger.

45) PR #3125 towards #3060. IntrinsicCall reference_accesses now returns
appropriate access_types (instead of a generic READWRITE).

9 changes: 5 additions & 4 deletions doc/developer_guide/coding-style.rst
@@ -161,13 +161,14 @@ Logging

PSyclone supports the Python ``logging`` module, to help report useful
information throughout the code. By default the logging infrastructure
is switched off, but can be enabled either through the PSyclone command's
is switched off, but can be enabled either through the PSyclone command-line
options or programmatically with:

.. code-block:: python

import logging
logging.basicConfig(level=logging.LEVEL)
logger = logging.getLogger("psyclone")
logger.setLevel(logging.LEVEL)

where ``logging.LEVEL`` is the required level, chosen from those defined in
the standard ``logging`` library.
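
For example, a minimal sketch (assuming no other logging configuration is in
place) that enables debug output from PSyclone and sends it to stderr:

.. code-block:: python

    import logging

    # Configure only PSyclone's top-level logger; the root logger is
    # left untouched.
    logger = logging.getLogger("psyclone")
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.StreamHandler())
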
@@ -199,8 +200,8 @@ standard ``warnings`` import, i.e.:

warnings.warn(self._deprecation_warning, DeprecationWarning, 2)

Logging should also be explicitly tested through the use of the ``caplog``
pytest fixture.
Care must be taken when testing logging calls added to PSyclone. See
:ref:`logging_testing` for details.


.. _interface_description:
23 changes: 23 additions & 0 deletions doc/developer_guide/working_practises.rst
@@ -175,6 +175,29 @@ provided to the pytest command line. It also ensures that (if
compilation testing is enabled) the LFRic-stub and GOcean infrastructure
libraries are compiled prior to any tests running.

.. _logging_testing:

Testing PSyclone's Logging Calls
--------------------------------
Logging should be explicitly tested through the use of the ``caplog``
pytest fixture. It is important that the expected logger is specified
in the test: if ``main`` (in ``generator.py``) is executed, a handler is
attached to the PSyclone top-level logger. This means that log events
are no longer propagated to the Python root logger, and ``caplog``
without a logger specified only captures the root logger. This can
result in tests passing when executed on their own but failing when run
together with other tests: those tests might call ``main`` (and
therefore prevent the root logger from receiving the message), and
``caplog`` will then fail if no logger is specified. The proper way of
testing log messages is:

.. code-block:: python

with caplog.at_level(logging.WARNING,
logger="psyclone.psyir.tools.read_write_info"):
...
assert "your message here" in caplog.text


.. _test_coverage:

5 changes: 2 additions & 3 deletions doc/user_guide/psyclone_command.rst
@@ -521,8 +521,7 @@ Enabling the Logging Infrastructure
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

PSyclone supports logging which can provide additional information
on what is happening inside PSyclone. This logging will also
control the behaviour of any logging calls inside a user script.
on what is happening inside PSyclone.

Logging output can be controlled through the ``--log-level`` option.
By default, logging is set to ``OFF``, which means
@@ -532,7 +531,7 @@ detailed in the ``psyclone -h`` information.
By default, the logging output goes to stderr.
To control the logging output, PSyclone provides the
``--log-file`` option. If this is set, the logging output will instead
be directed to the provided file.
be directed to the specified file.
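
For reference, these options configure the top-level ``psyclone`` (and
``fparser``) loggers rather than the Python root logger. A rough sketch of
the equivalent programmatic set-up (the file name and level here are
illustrative only):

.. code-block:: python

    import logging

    # Append log records to a file, mirroring the --log-file behaviour.
    handler = logging.FileHandler("psyclone.log", mode="a", encoding="utf-8")
    handler.setFormatter(
        logging.Formatter("%(levelname)s:%(name)s:%(message)s"))
    for name in ("psyclone", "fparser"):
        logger = logging.getLogger(name)
        logger.setLevel(logging.DEBUG)
        logger.addHandler(handler)
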

Keeping Comments and Directives
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
25 changes: 20 additions & 5 deletions src/psyclone/generator.py
@@ -620,14 +620,29 @@ def main(arguments):
args = parser.parse_args(arguments)

# Set the logging system up.
loglevel = LOG_LEVELS[args.log_level]
if args.log_file:
logname = args.log_file
logging.basicConfig(filename=logname,
level=loglevel)
handler = logging.FileHandler(args.log_file, mode="a",
encoding="utf-8")
else:
logging.basicConfig(level=loglevel)
handler = logging.StreamHandler()

# We set both top-level loggers for PSyclone and fparser to
# the same configuration:
logger_psyclone = logging.getLogger('psyclone')
logger_fparser = logging.getLogger('fparser')
formatter = logging.Formatter('%(levelname)s:%(name)s:%(message)s')
for logger in [logger_fparser, logger_psyclone]:
# Certain tests call main several times, which would add handlers
# over and over (which results in duplicated messages).
# So, don't attach a handler if one is already there
if logger.handlers:
continue
logger.setLevel(LOG_LEVELS[args.log_level])
handler.setFormatter(formatter)
logger.addHandler(handler)

logger = logging.getLogger(__name__)

logger.debug("Logging system initialised. Level is %s.", args.log_level)

# Validate that the given arguments are for the right operation mode
2 changes: 1 addition & 1 deletion src/psyclone/tests/configuration_test.py
@@ -633,7 +633,7 @@ def test_deprecated_access_mapping(tmpdir, caplog):
("[lfric]\naccess_mapping = gh_read: read, gh_write: write, "
"gh_readwrite: readwrite,\n gh_inc: inc, "
"gh_sum: sum\n"))
with caplog.at_level(logging.WARN):
with caplog.at_level(logging.WARN, logger="psyclone.configuration"):
_ = get_config(config_file, content)

assert "Configuration file contains an ACCESS_MAPPING entry" in caplog.text
34 changes: 33 additions & 1 deletion src/psyclone/tests/conftest.py
@@ -39,8 +39,11 @@
''' Module which performs pytest set-up so that we can specify
command-line options. Also creates certain test fixtures. '''

import os
import copy
import logging
import os
import sys

import pytest

from fparser.two.parser import ParserFactory
@@ -94,6 +97,35 @@ def have_graphviz(): # pragma: no-cover
return True


@pytest.fixture(scope="function", autouse=True)
def setup_logging():
"""
This fixture sets up logging the same way `main` does.
Using it ensures that any caplog-based test must specify the expected
logger. If `main` is executed, a handler is attached to
the PSyclone top-level logger, which means messages are not
propagated to the Python root logger, and caplog `at_level(x)`
will fail (since it only captures the root logger, which is not reached).
This can result in inconsistent test failures depending on the order
of test execution: a test will pass if `main` has not been executed
by another test previously (since no handler is then attached),
and will fail if `main` has been executed previously.
The proper way of testing with caplog is:
caplog.at_level(x, logger="psyclone.module.module...")
"""

logger_psyclone = logging.getLogger('psyclone')
# In case some tests have added handlers, remove them all.
while logger_psyclone.handlers:
logger_psyclone.removeHandler(logger_psyclone.handlers[0])

# Then add exactly one handler (which is what happens in main):
handler = logging.StreamHandler()
logger_psyclone.addHandler(handler)
# Effectively disable the logger, which is the default state.
logger_psyclone.setLevel(sys.maxsize)


@pytest.fixture(scope="session", autouse=True)
def setup_psyclone_config():
'''This per session fixture defines the environment variable
76 changes: 47 additions & 29 deletions src/psyclone/tests/generator_test.py
@@ -485,7 +485,7 @@ def _broken(_1, _2):
raise ValueError("This is a test")

monkeypatch.setattr(FortranReader, "psyir_from_file", _broken)
with caplog.at_level(logging.ERROR):
with caplog.at_level(logging.ERROR, logger="psyclone.generator"):
with pytest.raises(SystemExit):
_ = generate(
os.path.join(BASE_PATH, "gocean1p0", "single_invoke.f90"),
@@ -786,7 +786,49 @@ def test_main_invalid_api(capsys):
assert output == expected_output


def test_main_api(capsys, caplog):
def test_main_logger(capsys, caplog, tmp_path):
"""
Test the setup of the logger.
"""

# The conftest `setup_logging` fixture will add a handler to the
# PSyclone top-level logger, meaning the corresponding line in
# generator.py is not executed. Remove the handler here so that we
# trigger adding a handler in generator.py.
logger = logging.getLogger("psyclone")
logger.removeHandler(logger.handlers[0])

filename = os.path.join(NEMO_BASE_PATH, "explicit_do_long_line.f90")
# Give invalid logging level
# Reset capsys
capsys.readouterr()
with pytest.raises(SystemExit):
main([filename, "-api", "lfric", "--log-level", "fail"])
_, err = capsys.readouterr()
# Error message check truncated as Python 3.13 changes how the
# array is output.
assert ("error: argument --log-level: invalid choice: 'fail'"
in err)

# Test we get the logging debug correctly with caplog, including
# redirection into a file:
caplog.clear()
out_file = str(tmp_path / "test.out")
with caplog.at_level(logging.DEBUG):
main([filename, "-api", "dynamo0.3", "--log-level", "DEBUG",
"--log-file", out_file])
assert Config.get().api == "lfric"
assert caplog.records[0].levelname == "DEBUG"
assert "Logging system initialised. Level is DEBUG." in caplog.text
# Check that we have a file handler installed as expected
file_handlers = [h for h in logger.handlers
if isinstance(h, logging.FileHandler)]
# There should be exactly one file handler, pointing to out_file:
assert len(file_handlers) == 1
assert file_handlers[0].baseFilename == out_file


def test_main_api():
''' Test that the API can be set by a command line parameter, also using
the API name aliases. '''

@@ -816,30 +858,6 @@ def test_main_api(capsys, caplog):
main([filename, "-api", "dynamo0.3"])
assert Config.get().api == "lfric"

# Give invalid logging level
# Reset capsys
capsys.readouterr()
with pytest.raises(SystemExit):
main([filename, "-api", "dynamo0.3", "--log-level", "fail"])
_, err = capsys.readouterr()
# Error message check truncated as Python 3.13 changes how the
# array is output.
assert ("error: argument --log-level: invalid choice: 'fail'"
in err)

# Test we get the logging debug correctly with caplog. This
# overrides the file output that PSyclone attempts.
caplog.clear()
# Pytest fully controls the logging level, overriding anything we
# set in generator.main so we can't test for it.
with caplog.at_level(logging.DEBUG):
main([filename, "-api", "dynamo0.3", "--log-level", "DEBUG",
"--log-file", "test.out"])
assert Config.get().api == "lfric"
assert caplog.records[0].levelname == "DEBUG"
assert ("Logging system initialised. Level is DEBUG." in
caplog.record_tuples[0][2])


def test_keep_comments_and_keep_directives(capsys, caplog, tmpdir_factory):
''' Test the keep comments and keep directives arguments to main. '''
@@ -899,7 +917,7 @@ def test_keep_comments_and_keep_directives(capsys, caplog, tmpdir_factory):
"""
assert output == correct

with caplog.at_level(logging.WARNING):
with caplog.at_level(logging.WARNING, logger="psyclone.generator"):
main([filename, "--keep-directives"])
assert ("keep_directives requires keep_comments so "
"PSyclone enabled keep_comments." in caplog.text)
@@ -1296,7 +1314,7 @@ def test_code_transformation_fixed_form(tmpdir, capsys, caplog):
caplog.clear()
# Check an unknown file extension gives a log message and fails for a
# fixed form input.
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger="psyclone.generator"):
inputfile = str(tmpdir.join("fixed_form.1s2"))
with open(inputfile, "w", encoding='utf-8') as my_file:
my_file.write(code)
@@ -1358,7 +1376,7 @@ def test_code_transformation_parse_failure(tmpdir, caplog, capsys):
inputfile = str(tmpdir.join("funny_syntax.f90"))
with open(inputfile, "w", encoding='utf-8') as my_file:
my_file.write(code)
with caplog.at_level(logging.ERROR):
with caplog.at_level(logging.ERROR, logger="psyclone.generator"):
with pytest.raises(SystemExit):
code_transformation_mode(inputfile, None, None, False, False,
False)
2 changes: 1 addition & 1 deletion src/psyclone/tests/gocean1p0_config_test.py
@@ -88,7 +88,7 @@ def test_command_line(capsys, caplog):
GOLoop._bounds_lookup = {}
# Check that --config with a parameter is accepted but logs a warning
# about the deprecated access_mapping entry.
with caplog.at_level(logging.WARN):
with caplog.at_level(logging.WARN, logger="psyclone.configuration"):
main(options+["--config", config_file, f90_file])
assert ("Configuration file contains an ACCESS_MAPPING entry. This is "
"deprecated" in caplog.text)
18 changes: 10 additions & 8 deletions src/psyclone/tests/parse/file_info_test.py
@@ -50,6 +50,8 @@
end program main
"""

TEST_LOGGER = "psyclone.parse.file_info"


def test_file_info_constructor():
"""
@@ -84,7 +86,7 @@ def test_file_info_missing_file(caplog):

"""
finfo = FileInfo("missing.txt")
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger=TEST_LOGGER):
with pytest.raises(FileNotFoundError) as err:
_ = finfo.get_source_code()
assert "Source file 'missing.txt': loading source code" in caplog.text
@@ -101,7 +103,7 @@ def test_file_info_cached_source_code(tmpdir, caplog):
with open(fname, "w", encoding="utf-8") as fout:
fout.write(content)
finfo = FileInfo(fname, cache_active=True)
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger=TEST_LOGGER):
input1 = finfo.get_source_code()
assert input1 == content
assert "a_file.txt': loaded OK" in caplog.text
@@ -116,14 +118,14 @@
assert finfo._cache_data_save is None

# Load fparser tree to start caching
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger=TEST_LOGGER):
finfo.get_fparser_tree()
assert finfo._cache_data_save is not None
assert "No cache file '" in caplog.text
assert "Cache file updated with hashsum " in caplog.text

caplog.clear()
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger=TEST_LOGGER):
finfo = FileInfo(fname, cache_active=True)
input1 = finfo.get_fparser_tree()
assert finfo._cache_data_load is not None
@@ -285,7 +287,7 @@ def test_file_info_load_from_cache_corrupted(tmpdir, caplog):
assert file_info._cache_data_save is None

# Load with damaged cache file
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger=TEST_LOGGER):
psyir_node = file_info.get_psyir()
assert isinstance(psyir_node, Node)
assert "Error while reading cache file - ignoring: " in caplog.text
@@ -353,7 +355,7 @@ def test_file_info_source_changed(tmpdir, caplog):
file_info: FileInfo = FileInfo(filename, cache_active=True)

# Load, but not from cache
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger=TEST_LOGGER):
psyir_node = file_info.get_psyir()
assert "Cache hashsum mismatch: source " in caplog.text
assert "Cache file updated with hashsum " in caplog.text
@@ -415,7 +417,7 @@ def test_file_info_cachefile_not_writable(tmpdir, caplog):
# If the psyir, hence, fparser tree is requested, creating
# the cache will fail, but the psyir node itself will
# still be returned.
with caplog.at_level(logging.WARN):
with caplog.at_level(logging.WARN, logger=TEST_LOGGER):
psyir_node = file_info.get_psyir()
assert isinstance(psyir_node, Node)
assert "Unable to write to cache file: " in caplog.text
@@ -438,7 +440,7 @@ def fun_exception(a, b):

monkeypatch.setattr("pickle.dump", fun_exception)

with caplog.at_level(logging.WARN):
with caplog.at_level(logging.WARN, logger=TEST_LOGGER):
psyir_node = file_info.get_psyir()
assert isinstance(psyir_node, Node)
assert "Error while storing cache data - ignoring:" in caplog.text
4 changes: 2 additions & 2 deletions src/psyclone/tests/parse/module_manager_test.py
@@ -507,13 +507,13 @@ def test_mod_manager_load_all_module_trigger_error_module_read_twice(
mod_man.add_files("d1/a_mod.f90")

# Load all module infos
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger="psyclone.parse.module_manager"):
mod_man.load_all_module_infos()

assert "Loading module information for file 'd1/a_mod.f90'" in caplog.text

# Doing this a 2nd time should not raise any error
with caplog.at_level(logging.INFO):
with caplog.at_level(logging.INFO, logger="psyclone.parse.module_manager"):
mod_man.load_all_module_infos()

assert "Module 'a_mod' already processed" in caplog.text