Conversation

@Neeratyoy Neeratyoy commented Feb 23, 2021

Reference Issue

Addresses #248.
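
A minimal sketch of the kind of wall-clock measurement this PR ("Measuring runtimes") adds tests around; the names here are illustrative, not the actual openml-python implementation:

```python
import time

from sklearn.dummy import DummyClassifier

X, y = [[0], [1], [2], [3]], [0, 0, 1, 1]
model = DummyClassifier(strategy="most_frequent")

# Time a (re)fit with wall-clock time; the PR's unit tests check that
# a refit time like this is recorded for a run.
start = time.time()
model.fit(X, y)
refit_time = time.time() - start
print(f"refit took {refit_time:.6f}s")
```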

@Neeratyoy Neeratyoy changed the title from "[skip ci] addressing #248" to "Measuring runtimes" Feb 23, 2021
@Neeratyoy Neeratyoy marked this pull request as ready for review February 24, 2021 13:58
@Neeratyoy Neeratyoy requested a review from mfeurer February 24, 2021 15:08
@Neeratyoy Neeratyoy requested a review from mfeurer February 25, 2021 12:53
@mfeurer mfeurer left a comment

Looks good, except for the failure that you mentioned offline

@Neeratyoy Neeratyoy requested a review from mfeurer March 2, 2021 16:16
@Neeratyoy Neeratyoy (Contributor, Author) replied:

> for the failure that you mentioned offline

Should pass with the changes made.
For the long term, should we open a new PR that removes the flow parameter from _run_task_get_arffcontent's argument list?
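
A hypothetical sketch of that cleanup (the real _run_task_get_arffcontent in openml-python takes more arguments than shown here):

```python
def _evaluate(model, task):
    # Stand-in for the real per-task evaluation logic.
    return f"ran {model} on {task}"

# Before (hypothetical): flow is accepted but never used in the body.
def _run_task_get_arffcontent_old(flow, model, task):
    return _evaluate(model, task)

# After: the redundant flow parameter is dropped from the argument list.
def _run_task_get_arffcontent(model, task):
    return _evaluate(model, task)

print(_run_task_get_arffcontent("clf", "task-31"))
```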

@mfeurer mfeurer left a comment

I think you can also remove the flow parameter in this PR, as it is not needed, and removing it helps reach this PR's goal.

@Neeratyoy Neeratyoy (Contributor, Author) commented:

Added a flaky decorator, and also followed some of the suggestions from here to test whether that prevents what is clearly a stochastic failure. Moreover, the old failures to attach entities are now randomly reappearing 😕
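
The flaky marker used here matches what the pytest-rerunfailures plugin provides; a minimal sketch with an explicit rerun budget (the test body and counts are illustrative):

```python
import random

import pytest

@pytest.mark.flaky(reruns=3)  # retry up to 3 times before reporting a failure
def test_stochastic_behaviour():
    # Stand-in for a test that fails nondeterministically on the test server.
    assert random.random() < 0.9
```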

@Neeratyoy Neeratyoy requested a review from mfeurer March 9, 2021 23:47
res = format_prediction(regression, *ignored_input)
self.assertListEqual(res, [0] * 5)

@pytest.mark.flaky() # appears to fail stochastically on test server
@mfeurer (Collaborator) commented:

Are these errors only with loky? If so, could we use multiprocessing as the default here, since we don't want to test for a specific backend in this unit test?
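
For reference, joblib lets the caller pin a backend explicitly instead of relying on the loky default; a minimal sketch of forcing the multiprocessing backend (the function and n_jobs are illustrative, not openml-python code):

```python
from joblib import Parallel, delayed, parallel_backend

def square(i):
    # Module-level function so the multiprocessing backend can pickle it.
    return i * i

if __name__ == "__main__":
    # Pin the backend rather than testing against joblib's loky default.
    with parallel_backend("multiprocessing"):
        squares = Parallel(n_jobs=2)(delayed(square)(i) for i in range(4))
    print(squares)  # [0, 1, 4, 9]
```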

@mfeurer mfeurer merged commit f94672e into develop Mar 12, 2021
@mfeurer mfeurer deleted the runtimes branch March 12, 2021 15:38
PGijsbers pushed a commit to Mirkazemi/openml-python that referenced this pull request Feb 23, 2023
* [skip ci] addressing openml#248

* Unit test to test existence of refit time

* Refactoring unit test

* Fixing unit test failures

* Unit test fixing + removing redundant parameter

* Debugging stochastic failure of test_joblib_backends unit test

* Unit test fix with decorators

* Flaky for failing unit test

* Adding flaky reruns for unit tests

* Fixing setup bug

* pytest rerun debug

* Fixing coverage failure

* Debugging coverage failure

* Debugging coverage failure

* Adding __init__ files in test/ for pytest-cov

* Debugging coverage failure

* Debugging lean unit test

* Debugging loky failure in unit tests

* Clean up of debugging stuff
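
For reference, the coverage-related commits above concern pytest-cov, which needs the test packages to be importable (hence the __init__.py additions). A minimal sketch of an equivalent programmatic invocation (paths are illustrative):

```python
import pytest

# Equivalent to: pytest --cov=openml --cov-report=term-missing tests/
# Requires the pytest-cov plugin; paths are illustrative.
pytest.main(["--cov=openml", "--cov-report=term-missing", "tests/"])
```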