
Fix test failures on Windows #206

Merged: 14 commits into projectfluent:main on Feb 7, 2025

Conversation

studyingegret (Contributor)
test_fallback.py uses a simple string comparison (via a dict lookup) to simulate files. On Windows, paths are joined with "\" instead of "/", which makes the string comparison inconsistent and produces the failures below. (Specifically, the "\" separators come from os.path.join in FluentResourceLoader.)
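For illustration only (not the PR's actual test code): a minimal sketch of the mismatch, assuming a dict-backed fake filesystem keyed by "/"-separated paths; the names fake_files, load_resource, and load_resource_normalized are hypothetical.

```python
# Hypothetical sketch of the separator mismatch; fake_files and load_resource
# are illustrative names, not the PR's actual test code.
import os.path

fake_files = {"de/main.ftl": "hello = Hallo"}  # keys written with "/"

def load_resource(path: str):
    # Naive dict lookup: misses on Windows, where `path` uses "\" separators.
    return fake_files.get(path)

def load_resource_normalized(path: str):
    # Normalizing the native separator to "/" keeps the comparison
    # platform-independent.
    return fake_files.get(path.replace(os.path.sep, "/"))

path = os.path.join("de", "main.ftl")  # "de\\main.ftl" on Windows, "de/main.ftl" elsewhere
print(load_resource(path))             # None on Windows
print(load_resource_normalized(path))  # "hello = Hallo" on every platform
```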

Without this fix, running python -m unittest discover -s fluent.runtime on Windows at df5ef40 produced the following failures (some folders in the paths are omitted):

....................................................................................................................................................E.F.F.D:\***\python-fluent\fluent.runtime\fluent\runtime\types.py:361: UserWarning: FluentDateType option hour12 is not yet supported
  warnings.warn(f"FluentDateType option {k} is not yet supported")
................s...............
======================================================================
ERROR: test_bundles (tests.test_fallback.TestLocalization.test_bundles)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\***\Python\Python313\Lib\unittest\mock.py", line 1424, in patched
    return func(*newargs, **newkeywargs)
  File "D:\***\python-fluent\fluent.runtime\tests\test_fallback.py", line 34, in test_bundles
    bundle_de = next(bundles_gen)
StopIteration

======================================================================
FAIL: test_all_exist (tests.test_fallback.TestResourceLoader.test_all_exist)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\***\Python\Python313\Lib\unittest\mock.py", line 1424, in patched
    return func(*newargs, **newkeywargs)
  File "D:\***\python-fluent\fluent.runtime\tests\test_fallback.py", line 64, in test_all_exist
    self.assertEqual(len(resources_list), 1)
    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError: 0 != 1

======================================================================
FAIL: test_one_exists (tests.test_fallback.TestResourceLoader.test_one_exists)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\***\Python\Python313\Lib\unittest\mock.py", line 1424, in patched
    return func(*newargs, **newkeywargs)
  File "D:\***\python-fluent\fluent.runtime\tests\test_fallback.py", line 76, in test_one_exists
    self.assertEqual(len(resources_list), 1)
    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError: 0 != 1

----------------------------------------------------------------------
Ran 186 tests in 0.325s

FAILED (failures=2, errors=1, skipped=1)

Note: This pull request has lint errors, but I don't know how to fix them.

studyingegret (Contributor, Author) commented on Feb 1, 2025:

@eemeli I see you ran the CI task and the test job failed. This is a known issue: Python 3.7 has reached end of life, so it is no longer available on ubuntu-latest. I used this workaround in another branch of my fork:

test:
  # Workaround Python 3.7 no longer supported https://github.com/actions/setup-python/issues/962#issuecomment-2414454450
  #runs-on: ubuntu-latest
  runs-on: ubuntu-22.04

studyingegret requested a review from eemeli on February 6, 2025 at 05:07
eemeli (Member) left a comment:


See inline for a few final simplifications.

> My point was that it was the file simulation code that caused the test errors on a different platform, so it makes sense to test the file simulation code.

Adding the tests is fine. I meant that, since the existing test_fallback.py tests end up using the mocking, those tests can only succeed if the mocking works as expected; they are therefore also implicitly testing the mocking.
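As an editorial illustration (not code from this PR), a minimal sketch of what testing the file-simulation helper directly could look like; simulate_isfile and the test names are hypothetical stand-ins.

```python
# Hypothetical sketch of testing the file-simulation helper itself; the helper
# name `simulate_isfile` is illustrative and not part of this PR.
import os.path
import unittest


def simulate_isfile(fake_files: dict, path: str) -> bool:
    """Stand-in for a mocked os.path.isfile backed by a dict of fake files."""
    return path.replace(os.path.sep, "/") in fake_files


class TestFileSimulation(unittest.TestCase):
    def test_native_separator(self):
        fake_files = {"de/main.ftl": "key = value"}
        # FluentResourceLoader builds paths with os.path.join, so the helper
        # must accept the native separator on every platform.
        self.assertTrue(simulate_isfile(fake_files, os.path.join("de", "main.ftl")))

    def test_missing_file(self):
        self.assertFalse(simulate_isfile({}, os.path.join("de", "main.ftl")))


if __name__ == "__main__":
    unittest.main()
```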

> Note: In 457d698 I made an opinionated change to no longer reflect files on disk (code), because it seems unnecessary and unused* in the current tests, and it may get in the way when you want to simulate that some files don't exist while real files with colliding names are present. I'd be happy to know what you think about that.

Yeah, dropping that seems like a good idea.

> Sorry about being clumsy, this is my first PR.

No worries, welcome to open source! Honestly, if you'd not mentioned that this was your first PR ever, I'd have presumed you to be much more experienced.

I'd be very happy to review further work from you here, or elsewhere.

studyingegret requested a review from eemeli on February 6, 2025 at 09:43
eemeli (Member) left a comment:


Thank you for your contribution!

eemeli merged commit fbf8b3f into projectfluent:main on Feb 7, 2025
21 checks passed
studyingegret added a commit to studyingegret/python-fluent that referenced this pull request on Feb 21, 2025:

The comment was a personal exclamation I added while learning the code, and it now seems more confusing than helpful. :) It is present in projectfluent#206.