Add flaky and pytest-benchmark installations to scripts/eachdist.py develop #2772

@jeremydvoss

Description

Describe your environment
Python 3.10. Linux and Windows.

Steps to reproduce
Follow the CONTRIBUTING steps in a clean environment. Upon running the "scripts/eachdist.py test" script, notice that flaky needs to be installed. Run "pip install flaky" and run the tests again. Notice that pytest-benchmark needs to be installed. Run "pip install pytest-benchmark", then run the tests again. They pass (albeit with some warnings) on Linux. I am personally still facing issues on Windows.
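The missing packages can be installed by hand as above; the fix this issue asks for is to have "scripts/eachdist.py develop" do it automatically. A hypothetical sketch of that idea (the function and variable names are my assumptions for illustration, not the actual eachdist.py code):

```python
import subprocess
import sys

# Assumption: these are the only test-time packages "develop" currently misses.
TEST_ONLY_DEPS = ["flaky", "pytest-benchmark"]

def pip_install_cmd(package):
    # Build a pip invocation bound to the current interpreter, so the
    # packages land in the same environment the tests run in.
    return [sys.executable, "-m", "pip", "install", package]

def install_test_deps():
    # Hypothetical helper a "develop" subcommand could call before
    # (or after) installing the editable packages.
    for package in TEST_ONLY_DEPS:
        subprocess.check_call(pip_install_cmd(package))
```

Using `sys.executable -m pip` rather than a bare `pip` avoids installing into the wrong interpreter when several Pythons are on PATH, which matters for the Windows setups described below.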

What is the expected behavior?
After running "scripts/eachdist.py develop", "scripts/eachdist.py test" should pass without any missing modules.

What is the actual behavior?
Instead, I get the following before installing flaky:

========================================== warnings summary ==========================================  

opentelemetry-api\tests\trace\test_globals.py:10 

  C:\Users\jeremyvoss\workplace\opentelemetry-python\opentelemetry-api\tests\trace\test_globals.py:10: PytestCollectionWarning: cannot collect test class 'TestSpan' because it has a __init__ constructor (from: opentelemetry-api/tests/trace/test_globals.py) 

    class TestSpan(trace.NonRecordingSpan): 

  

    from flaky import flaky 

E   ModuleNotFoundError: No module named 'flaky'

========================================== warnings summary ==========================================

opentelemetry-sdk\src\opentelemetry\sdk\trace\__init__.py:1163

  c:\users\jeremyvoss\workplace\opentelemetry-python\opentelemetry-sdk\src\opentelemetry\sdk\trace\__init__.py:1163: DeprecationWarning: Call to deprecated method __init__. (You should use InstrumentationScope) -- Deprecated since version 1.11.1. 

    InstrumentationInfo( 

  

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

==================================== 1 warning, 1 error in 0.62s =====================================

'pytest C:/Users/jeremyvoss/workplace/opentelemetry-python/opentelemetry-sdk/tests ' failed with code 2 

After installing flaky, I get the following before installing pytest-benchmark:

SKIPPED [1] opentelemetry-sdk\tests\logs\test_export.py:274: needs *nix 

SKIPPED [1] opentelemetry-sdk\tests\metrics\integration_test\test_time_align.py:149: test failing in CI when run in Windows 

SKIPPED [1] opentelemetry-sdk\tests\trace\export\test_export.py:370: needs *nix 

======================= 334 passed, 3 skipped, 96 warnings, 2 errors in 12.14s =======================  

Exception while exporting metrics I/O operation on closed file. 

Traceback (most recent call last): 

  File "c:\users\jeremyvoss\workplace\opentelemetry-python\opentelemetry-sdk\src\opentelemetry\sdk\metrics\_internal\export\__init__.py", line 412, in _receive_metrics 

    self._exporter.export(metrics_data, timeout_millis=timeout_millis) 

  File "c:\users\jeremyvoss\workplace\opentelemetry-python\opentelemetry-sdk\src\opentelemetry\sdk\metrics\_internal\export\__init__.py", line 121, in export 

    self.out.write(self.formatter(metrics_data)) 

ValueError: I/O operation on closed file. 

Can't shutdown multiple times 

Exception ignored in atexit callback: <bound method MeterProvider.shutdown of <opentelemetry.sdk.metrics.MeterProvider object at 0x0000024441FDBFA0>> 

Traceback (most recent call last): 

  File "c:\users\jeremyvoss\workplace\opentelemetry-python\opentelemetry-sdk\src\opentelemetry\sdk\metrics\_internal\__init__.py", line 417, in shutdown 

    did_shutdown = self._shutdown_once.do_once(_shutdown) 

AttributeError: 'MeterProvider' object has no attribute '_shutdown_once' 

'pytest C:/Users/jeremyvoss/workplace/opentelemetry-python/opentelemetry-sdk/tests ' failed with code 1 

 

ERRORS: 

file C:\Users\jeremyvoss\workplace\opentelemetry-python\opentelemetry-sdk\tests\performance\benchmarks\trace\test_benchmark_trace.py, line 31 

  def test_simple_start_span(benchmark): 

E       fixture 'benchmark' not found 

Additional context
As noted above, I think there is still some work needed to make the CONTRIBUTING document fully applicable to Windows. It seems to me that following the instructions on Windows yields yet another issue that I am still investigating.

Metadata


    Labels

    bug (Something isn't working), discussion (Issue or PR that needs/is extended discussion), doc (Documentation-related)
