PYTHON-5055 - Convert test_client.py to pytest #2074
Conversation
…meout on server selection
evergreen retry
I think the Python changes look good to me. (I found things that I needed to change on my PR / test conversions as a result of reviewing this so that was helpful haha)
The failing …
.evergreen/run-tests.sh
Outdated
uv run ${UV_ARGS[*]} pytest "${ASYNC_PYTEST_ARGS[@]}" "--collect-only"
collected=$?
set -o errexit
# If we collected at least one async test, run all collected tests
What's the benefit of using the "collect then run" approach? Wouldn't it be simpler to run pytest even if all tests are skipped?
Good point. My original intention was to avoid the error-handling that would require, but this approach doesn't avoid that either.
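For readers following along: the error handling in question is presumably pytest's behavior when nothing matches the selection. Running the async suite unconditionally exits with status 5 (`pytest.ExitCode.NO_TESTS_COLLECTED`) when no `asyncio`-marked tests exist yet, which `set -o errexit` treats as a failure. A minimal sketch of tolerating that case (shown in-process via `pytest.main` purely for illustration; the real script invokes pytest from bash):

```python
# Sketch: treat "no tests collected" (exit status 5) as a no-op rather than a failure.
import pytest

exit_code = pytest.main(["-m", "asyncio"])  # same marker selection the script builds

if exit_code == pytest.ExitCode.NO_TESTS_COLLECTED:
    print("No asyncio-marked tests collected; nothing to run.")
elif exit_code != pytest.ExitCode.OK:
    raise SystemExit(int(exit_code))
```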
test/utils.py
Outdated
@@ -826,7 +827,7 @@ def lazy_client_trial(reset, target, test, get_client):
     with frequent_thread_switches():
         for _i in range(NTRIALS):
             reset(collection)
-            lazy_client = get_client()
+            lazy_client = client
I don't think this test is the same anymore. It's supposed to create a new client each iteration but this new code looks to reuse the same client.
Good catch, fixed.
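(Not having seen the final diff, the names below are hypothetical.) One common pytest shape for "new client on every iteration" is a factory-style fixture rather than a single shared `client` fixture:

```python
# Hypothetical sketch: a factory fixture so each trial iteration creates a
# brand-new client (matching the old get_client() behavior), with cleanup at teardown.
import pytest
from pymongo import MongoClient

NTRIALS = 5  # stand-in for the module-level constant in test/utils.py


@pytest.fixture
def client_factory():
    created = []

    def _make(*args, **kwargs):
        client = MongoClient(*args, **kwargs)
        created.append(client)
        return client

    yield _make
    for client in created:  # close every client created during the test
        client.close()


def test_lazy_client_trial(client_factory):
    for _i in range(NTRIALS):
        lazy_client = client_factory()  # fresh client per iteration, not a reused one
        assert lazy_client is not None
```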
test/asynchronous/pymongo_mocks.py
Outdated
@@ -166,7 +166,8 @@ async def get_async_mock_client(
         standalones, members, mongoses, hello_hosts, arbiters, down_hosts, *args, **kwargs
     )

-    await c.aconnect()
+    if "connect" not in kwargs or "connect" in kwargs and kwargs["connect"]:
Suggest `if kwargs.get("connect", True):`
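For anyone skimming, a standalone illustration (helper names invented) of why the two spellings are interchangeable here:

```python
# Sketch: kwargs.get("connect", True) collapses the longer membership-plus-truthiness check.
def should_connect_verbose(**kwargs):
    return "connect" not in kwargs or "connect" in kwargs and kwargs["connect"]


def should_connect_concise(**kwargs):
    return kwargs.get("connect", True)


for case in ({}, {"connect": True}, {"connect": False}):
    assert bool(should_connect_verbose(**case)) == bool(should_connect_concise(**case))
```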
@pytest.mark.usefixtures("require_integration")
@pytest.mark.integration
Is there a simple way to avoid needing to mark the test as "integration" twice?
`@pytest.mark.usefixtures("require_integration")` isn't actually applying a selectable mark; it requests the `require_integration` fixture as a requirement for the decorated test. `pytest -m require_integration` won't select any tests, but `pytest -m integration` will.
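A self-contained illustration of that distinction; the fixture body is a stand-in, not the driver's actual `require_integration`:

```python
# Sketch: usefixtures wires in a requirement, while pytest.mark.integration is what
# `-m integration` selects on. (Marker registration lives in the pytest config.)
import pytest


def integration_server_available():
    return False  # placeholder for a real environment check


@pytest.fixture(scope="session")
def require_integration():
    # Stand-in for the real fixture: skip unless an integration deployment is available.
    if not integration_server_available():
        pytest.skip("integration environment not available")


@pytest.mark.usefixtures("require_integration")  # a requirement, not a selectable mark
@pytest.mark.integration                         # selected by `pytest -m integration`
class TestClientIntegration:
    def test_something(self):
        assert True
```

`pytest -m integration` collects `TestClientIntegration`, while `pytest -m require_integration` collects nothing, because `usefixtures` never registers a mark named after the fixture.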
I see this comment from the PR description is related:

> `unit` and `integration` pytest marks have been added. Going forward, all test classes should be marked with the appropriate one.

If integration tests need to be tagged with `require_integration`, what's the benefit of using the `unit` vs `integration` marks?
Using the `integration` and `unit` marks allows us to run only that set of tests. If we want to add additional subsets of tests that cross suites in the future, having those marks will be a good first step.
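For reference, a minimal sketch of how such suite-crossing marks are typically registered so `-m` selection works without unknown-marker warnings (shown via a `pytest_configure` hook; the project may well do this in its config file instead):

```python
# conftest.py sketch: register the custom marks used for cross-suite selection.
def pytest_configure(config):
    config.addinivalue_line("markers", "unit: tests that do not require a running server")
    config.addinivalue_line("markers", "integration: tests that talk to a live deployment")
```

`pytest -m unit` and `pytest -m integration` then each run only the matching classes, and marks compose for future subsets, e.g. `pytest -m "integration and not slow"` with a hypothetical `slow` mark.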
PYTEST_ARGS="-m $TEST_SUITES $PYTEST_ARGS"
# Workaround until unittest -> pytest conversion is complete
if [[ "$TEST_SUITES" == *"default_async"* ]]; then
    ASYNC_PYTEST_ARGS=("-m asyncio" "--junitxml=xunit-results/TEST-asyncresults.xml" "${PYTEST_ARGS[@]}")
What about coverage? Does that pick up both test runs automatically?
This shows it's not working as expected. The second coverage run will overwrite the `.coverage` output file by default: https://coverage.readthedocs.io/en/7.6.10/cmd.html#data-file
Good catch, does it make sense to use `--append` on the second run to keep the single file?
The alternative would be to use a separate file with `COVERAGE_FILE` and then `coverage combine` the two.
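To make the two options concrete: `--append` keeps writing into the single `.coverage` file, while the `COVERAGE_FILE` route gives each run its own data file and merges them afterwards. A sketch of that merge using coverage.py's Python API (the per-run file names are assumptions; the shell equivalent is exporting `COVERAGE_FILE` before each run and then calling `coverage combine`):

```python
# Sketch: merge per-run coverage data files into one combined data file.
# Assumes the sync run wrote ".coverage.sync" and the async run wrote ".coverage.async".
import coverage

cov = coverage.Coverage(data_file=".coverage")
cov.combine([".coverage.sync", ".coverage.async"], keep=True)  # keep=True leaves the inputs on disk
cov.save()
```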
Main points of note:

- `unittest` methods have been converted to `pytest` fixtures.
- `scope` is used instead of `loop_scope`.
- `ClientContext` has been kept as a metadata store for our tests, served by a session-scoped fixture (see the sketch after this list).
- `unit` and `integration` pytest marks have been added. Going forward, all test classes should be marked with the appropriate one.
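A rough sketch of the session-scoped fixture pattern described above; the attribute names and fixture body are illustrative, not the driver's actual `ClientContext`:

```python
# conftest.py-style sketch: build one ClientContext per test session and hand it to
# tests as shared metadata about the deployment under test.
import pytest
from pymongo import MongoClient
from pymongo.errors import PyMongoError


class ClientContext:
    """Illustrative stand-in: caches deployment metadata for the whole session."""

    def __init__(self):
        self.client = MongoClient(serverSelectionTimeoutMS=500)
        try:
            self.hello = self.client.admin.command("hello")
            self.connected = True
        except PyMongoError:
            self.hello = {}
            self.connected = False


@pytest.fixture(scope="session")
def client_context():
    ctx = ClientContext()
    yield ctx
    ctx.client.close()


@pytest.mark.integration
class TestUsingContext:
    def test_deployment_metadata(self, client_context):
        # Tests read shared, session-scoped metadata instead of re-querying the server.
        assert isinstance(client_context.connected, bool)
```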