Freezegun's freeze_time throws odd transformers error when used

I have a test that throws an odd error whenever freeze_time is called. Reading the stack trace, freeze_time is most likely interacting with another dependency in an unexpected way, but I don't have enough Python experience to work out what the problem might be.

self = <module 'transformers.models.transfo_xl' from '/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/transformers/models/transfo_xl/__init__.py'>, module_name = 'tokenization_transfo_xl'

    def _get_module(self, module_name: str):
        try:
>           return importlib.import_module("." + module_name, self.__name__)

.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:905:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

name = '.tokenization_transfo_xl', package = 'transformers.models.transfo_xl'

    def import_module(name, package=None):
        """Import a module.

        The 'package' argument is required when performing a relative import. It
        specifies the package to use as the anchor point from which to resolve the
        relative import to an absolute import.

        """
        level = 0
        if name.startswith('.'):
            if not package:
                msg = ("the 'package' argument is required to perform a relative "
                       "import for {!r}")
                raise TypeError(msg.format(name))
            for character in name:
                if character != '.':
                    break
                level += 1
>       return _bootstrap._gcd_import(name[level:], package, level)

../../../.pyenv/versions/3.10.3/lib/python3.10/importlib/__init__.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

name = 'transformers.models.transfo_xl.tokenization_transfo_xl', package = 'transformers.models.transfo_xl', level = 1

>   ???

<frozen importlib._bootstrap>:1050:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

name = 'transformers.models.transfo_xl.tokenization_transfo_xl', import_ = <function _gcd_import at 0x10d54b400>

>   ???

<frozen importlib._bootstrap>:1027:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

name = 'transformers.models.transfo_xl.tokenization_transfo_xl', import_ = <function _gcd_import at 0x10d54b400>

>   ???

<frozen importlib._bootstrap>:1006:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

spec = ModuleSpec(name='transformers.models.transfo_xl.tokenization_transfo_xl', loader=<_frozen_importlib_external.SourceFil...bhalla/dev/second/server/.venv/lib/python3.10/site-packages/transformers/models/transfo_xl/tokenization_transfo_xl.py')

>   ???

<frozen importlib._bootstrap>:688:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_frozen_importlib_external.SourceFileLoader object at 0x15c13db10>
module = <module 'transformers.models.transfo_xl.tokenization_transfo_xl' from '/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/transformers/models/transfo_xl/tokenization_transfo_xl.py'>

>   ???

<frozen importlib._bootstrap_external>:883:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

f = <built-in function exec>
args = (<code object <module> at 0x15c185d10, file "/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/trans...ns.Counter'>, 'List': typing.List, 'Optional': typing.Optional, 'OrderedDict': <class 'collections.OrderedDict'>, ...})
kwds = {}

>   ???

<frozen importlib._bootstrap>:241:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    # coding=utf-8
    # Copyright 2018 Google AI, Google Brain and Carnegie Mellon University Authors and the HuggingFace Inc. team.
    # Copyright (c) 2018, NVIDIA CORPORATION.  All rights reserved.
    #
    # Licensed under the Apache License, Version 2.0 (the "License");
    # you may not use this file except in compliance with the License.
    # You may obtain a copy of the License at
    #
    #     http://www.apache.org/licenses/LICENSE-2.0
    #
    # Unless required by applicable law or agreed to in writing, software
    # distributed under the License is distributed on an "AS IS" BASIS,
    # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    # See the License for the specific language governing permissions and
    # limitations under the License.
    """
     Tokenization classes for Transformer XL model. Adapted from https://github.com/kimiyoung/transformer-xl.
    """


    import glob
    import os
    import pickle
    import re
    from collections import Counter, OrderedDict
    from typing import List, Optional, Tuple

    import numpy as np

>   import sacremoses as sm
E   ModuleNotFoundError: No module named 'sacremoses'

.venv/lib/python3.10/site-packages/transformers/models/transfo_xl/tokenization_transfo_xl.py:30: ModuleNotFoundError

The above exception was the direct cause of the following exception:

client = <starlette.testclient.TestClient object at 0x15b2a19c0>

    def test_time_weight(client):
        "Tests whether it favours more recent memories"

        memory_response_1 = client.post(
            "/remember", json={"type": "DIRECT", "text": "the car is at the back"}
        )

        assert memory_response_1.status_code == 200

        e2e_helper.consume_context_queue()

>       with freeze_time(datetime.now() - timedelta(minutes=60), tick=True):

src/e2e/test_questions.py:168:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.venv/lib/python3.10/site-packages/freezegun/api.py:613: in __enter__
    return self.start()
.venv/lib/python3.10/site-packages/freezegun/api.py:702: in start
    module_attrs = _get_cached_module_attributes(module)
.venv/lib/python3.10/site-packages/freezegun/api.py:129: in _get_cached_module_attributes
    _setup_module_cache(module)
.venv/lib/python3.10/site-packages/freezegun/api.py:108: in _setup_module_cache
    all_module_attributes = _get_module_attributes(module)
.venv/lib/python3.10/site-packages/freezegun/api.py:97: in _get_module_attributes
    attribute_value = getattr(module, attribute_name)
.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:896: in __getattr__
    value = getattr(module, name)
.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:895: in __getattr__
    module = self._get_module(self._class_to_module[name])
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <module 'transformers.models.transfo_xl' from '/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/transformers/models/transfo_xl/__init__.py'>, module_name = 'tokenization_transfo_xl'

    def _get_module(self, module_name: str):
        try:
            return importlib.import_module("." + module_name, self.__name__)
        except Exception as e:
>           raise RuntimeError(
                f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
                f" traceback):\n{e}"
            ) from e
E           RuntimeError: Failed to import transformers.models.transfo_xl.tokenization_transfo_xl because of the following error (look up to see its traceback):
E           No module named 'sacremoses'

.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:907: RuntimeError

The libraries that seem to be involved, transfo_xl and sacremoses, are not referenced anywhere in my code base. It's possible that another dependency uses them, but again, these errors only appear when freeze_time is called, which makes me wonder why this is happening.

It's possible I could just install sacremoses, but given it's not a dependency of my project, I'd rather solve the root problem here.

Does anyone see something obvious I'm missing? Alternatively, can anyone recommend a library that provides the same functionality as freeze_time?

Edit: installing pandas seems to change the error slightly:

self = <module 'transformers.models.tapas' from '/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/transformers/models/tapas/__init__.py'>
module_name = 'tokenization_tapas'

    def _get_module(self, module_name: str):
        try:
>           return importlib.import_module("." + module_name, self.__name__)

.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:905:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

name = '.tokenization_tapas', package = 'transformers.models.tapas'

    def import_module(name, package=None):
        """Import a module.

        The 'package' argument is required when performing a relative import. It
        specifies the package to use as the anchor point from which to resolve the
        relative import to an absolute import.

        """
        level = 0
        if name.startswith('.'):
            if not package:
                msg = ("the 'package' argument is required to perform a relative "
                       "import for {!r}")
                raise TypeError(msg.format(name))
            for character in name:
                if character != '.':
                    break
                level += 1
>       return _bootstrap._gcd_import(name[level:], package, level)

../../../.pyenv/versions/3.10.3/lib/python3.10/importlib/__init__.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

name = 'transformers.models.tapas.tokenization_tapas', package = 'transformers.models.tapas', level = 1

>   ???

<frozen importlib._bootstrap>:1050:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

name = 'transformers.models.tapas.tokenization_tapas', import_ = <function _gcd_import at 0x10b4c3400>

>   ???

<frozen importlib._bootstrap>:1027:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

name = 'transformers.models.tapas.tokenization_tapas', import_ = <function _gcd_import at 0x10b4c3400>

>   ???

<frozen importlib._bootstrap>:1006:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

spec = ModuleSpec(name='transformers.models.tapas.tokenization_tapas', loader=<_frozen_importlib_external.SourceFileLoader ob...='/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/transformers/models/tapas/tokenization_tapas.py')

>   ???

<frozen importlib._bootstrap>:688:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_frozen_importlib_external.SourceFileLoader object at 0x15a96f850>
module = <module 'transformers.models.tapas.tokenization_tapas' from '/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/transformers/models/tapas/tokenization_tapas.py'>

>   ???

<frozen importlib._bootstrap_external>:883:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

f = <built-in function exec>
args = (<code object <module> at 0x15a9ac3a0, file "/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/trans... `'pt'`: Return PyTorch `torch.Tensor` objects.\n                - `'np'`: Return Numpy `np.ndarray` objects.\n", ...})
kwds = {}

>   ???

<frozen importlib._bootstrap>:241:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    # coding=utf-8
    # Copyright 2020 Google Research and The HuggingFace Inc. team.
    #
    # Licensed under the Apache License, Version 2.0 (the "License");
    # you may not use this file except in compliance with the License.
    # You may obtain a copy of the License at
    #
    #     http://www.apache.org/licenses/LICENSE-2.0
    #
    # Unless required by applicable law or agreed to in writing, software
    # distributed under the License is distributed on an "AS IS" BASIS,
    # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    # See the License for the specific language governing permissions and
    # limitations under the License.
    """ Tokenization class for TAPAS model."""


    import collections
    import datetime
    import enum
    import itertools
    import math
    import os
    import re
    import unicodedata
    from dataclasses import dataclass
    from typing import Callable, Dict, Generator, List, Optional, Text, Tuple, Union

    import numpy as np

    from ...tokenization_utils import PreTrainedTokenizer, _is_control, _is_punctuation, _is_whitespace
    from ...tokenization_utils_base import (
        ENCODE_KWARGS_DOCSTRING,
        BatchEncoding,
        EncodedInput,
        PreTokenizedInput,
        TextInput,
    )
    from ...utils import ExplicitEnum, PaddingStrategy, TensorType, add_end_docstrings, is_pandas_available, logging


    if is_pandas_available():
>       import pandas as pd

.venv/lib/python3.10/site-packages/transformers/models/tapas/tokenization_tapas.py:43:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    # flake8: noqa

    __docformat__ = "restructuredtext"

    # Let users know if they're missing any of our hard dependencies
    hard_dependencies = ("numpy", "pytz", "dateutil")
    missing_dependencies = []

    for dependency in hard_dependencies:
        try:
            __import__(dependency)
        except ImportError as e:
            missing_dependencies.append(f"{dependency}: {e}")

    if missing_dependencies:
        raise ImportError(
            "Unable to import required dependencies:\n" + "\n".join(missing_dependencies)
        )
    del hard_dependencies, dependency, missing_dependencies

    # numpy compat
>   from pandas.compat import is_numpy_dev as _is_numpy_dev

.venv/lib/python3.10/site-packages/pandas/__init__.py:22:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    """
    compat
    ======

    Cross-compatible functions for different versions of Python.

    Other items:
    * platform checker
    """
    import os
    import platform
    import sys

    from pandas._typing import F
>   from pandas.compat.numpy import (
        is_numpy_dev,
        np_version_under1p19,
        np_version_under1p20,
    )

.venv/lib/python3.10/site-packages/pandas/compat/__init__.py:15:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    """ support numpy compatibility across versions """
    import numpy as np

>   from pandas.util.version import Version

.venv/lib/python3.10/site-packages/pandas/compat/numpy/__init__.py:4:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   from pandas.util._decorators import (  # noqa:F401
        Appender,
        Substitution,
        cache_readonly,
    )

.venv/lib/python3.10/site-packages/pandas/util/__init__.py:1:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    from __future__ import annotations

    from functools import wraps
    import inspect
    from textwrap import dedent
    from typing import (
        Any,
        Callable,
        Mapping,
        cast,
    )
    import warnings

>   from pandas._libs.properties import cache_readonly  # noqa:F401

.venv/lib/python3.10/site-packages/pandas/util/_decorators.py:14:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    __all__ = [
        "NaT",
        "NaTType",
        "OutOfBoundsDatetime",
        "Period",
        "Timedelta",
        "Timestamp",
        "iNaT",
        "Interval",
    ]


>   from pandas._libs.interval import Interval

.venv/lib/python3.10/site-packages/pandas/_libs/__init__.py:13:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   ???

pandas/_libs/interval.pyx:1:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   ???

pandas/_libs/hashtable.pyx:1:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   ???

pandas/_libs/missing.pyx:1:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    __all__ = [
        "dtypes",
        "localize_pydatetime",
        "NaT",
        "NaTType",
        "iNaT",
        "nat_strings",
        "OutOfBoundsDatetime",
        "OutOfBoundsTimedelta",
        "IncompatibleFrequency",
        "Period",
        "Resolution",
        "Timedelta",
        "normalize_i8_timestamps",
        "is_date_array_normalized",
        "dt64arr_to_periodarr",
        "delta_to_nanoseconds",
        "ints_to_pydatetime",
        "ints_to_pytimedelta",
        "get_resolution",
        "Timestamp",
        "tz_convert_from_utc_single",
        "to_offset",
        "Tick",
        "BaseOffset",
        "tz_compare",
    ]

    from pandas._libs.tslibs import dtypes
>   from pandas._libs.tslibs.conversion import (
        OutOfBoundsTimedelta,
        localize_pydatetime,
    )

.venv/lib/python3.10/site-packages/pandas/_libs/tslibs/__init__.py:30:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   ???

pandas/_libs/tslibs/conversion.pyx:1:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   ???
E   TypeError: type 'pandas._libs.tslibs.base.ABCTimestamp' is not dynamically allocated but its base type 'FakeDatetime' is dynamically allocated

pandas/_libs/tslibs/base.pyx:1: TypeError

The above exception was the direct cause of the following exception:

client = <starlette.testclient.TestClient object at 0x159946f20>

    def test_time_weight(client):
        "Tests whether it favours more recent memories"

        memory_response_1 = client.post(
            "/remember", json={"type": "DIRECT", "text": "the car is at the back"}
        )

        assert memory_response_1.status_code == 200

        e2e_helper.consume_context_queue()

>       with freeze_time(datetime.now() - timedelta(minutes=60), tick=True):

src/e2e/test_questions.py:168:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.venv/lib/python3.10/site-packages/freezegun/api.py:633: in __enter__
    return self.start()
.venv/lib/python3.10/site-packages/freezegun/api.py:722: in start
    module_attrs = _get_cached_module_attributes(module)
.venv/lib/python3.10/site-packages/freezegun/api.py:129: in _get_cached_module_attributes
    _setup_module_cache(module)
.venv/lib/python3.10/site-packages/freezegun/api.py:108: in _setup_module_cache
    all_module_attributes = _get_module_attributes(module)
.venv/lib/python3.10/site-packages/freezegun/api.py:97: in _get_module_attributes
    attribute_value = getattr(module, attribute_name)
.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:896: in __getattr__
    value = getattr(module, name)
.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:895: in __getattr__
    module = self._get_module(self._class_to_module[name])
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <module 'transformers.models.tapas' from '/Users/rbhalla/dev/second/server/.venv/lib/python3.10/site-packages/transformers/models/tapas/__init__.py'>
module_name = 'tokenization_tapas'

    def _get_module(self, module_name: str):
        try:
            return importlib.import_module("." + module_name, self.__name__)
        except Exception as e:
>           raise RuntimeError(
                f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
                f" traceback):\n{e}"
            ) from e
E           RuntimeError: Failed to import transformers.models.tapas.tokenization_tapas because of the following error (look up to see its traceback):
E           type 'pandas._libs.tslibs.base.ABCTimestamp' is not dynamically allocated but its base type 'FakeDatetime' is dynamically allocated

.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:907: RuntimeError
Avelin answered 16/7, 2022 at 20:20 Comment(2)
Can you show the test definition? It seems that sacremoses is a dependency for testing the transformers library, as seen here --> github.com/huggingface/transformers/blob/… which is then used in other extras installations. – Insipid
Apologies for the late reply @Insipid, and thanks for the interesting insight. I'm happy to show the test case, but it's an integration test that mocks a FastAPI client, so the test is effectively: call an endpoint, freeze time, call the same endpoint. I've updated the question above with something new I found: the error message changes when pandas is installed. I'm not sure whether that gives any hints that I'm unable to parse. – Avelin

This is what I needed to add to fix this problem:

freezegun.configure(extend_ignore_list=['transformers'])

If anyone could explain why this was needed, I would appreciate it.
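
From what I can tell from the tracebacks (a guess, so corrections are welcome): when freeze_time starts, freezegun walks the attributes of every already-imported module looking for date/datetime objects to patch (freezegun/api.py, _get_module_attributes calling getattr on each attribute name). transformers exposes its submodules through a lazy module whose __getattr__ imports the real submodule on first access, so freezegun's scan ends up importing optional tokenizers such as transfo_xl and tapas, which in turn require sacremoses or pandas. Adding transformers to freezegun's ignore list keeps the scan away from that package entirely. Here's a minimal sketch of where the configure call could live; I'm assuming pytest with a top-level conftest.py, so adjust the location to wherever your test setup runs first:

    # conftest.py (assumed location; any code that runs before the tests will do)
    import freezegun

    # Keep freezegun's module scan away from the transformers package so its
    # lazy __getattr__ never triggers imports of optional tokenizer modules.
    freezegun.configure(extend_ignore_list=["transformers"])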

Avelin answered 5/9, 2022 at 20:49 Comment(1)
For me this never happened with transformers==4.25.1, but it started happening with later versions. – Shanel
