Patch OpenLLaMa tokenizer imports for transformers 4.29 compatibility (#1315)

Summary:
X-link: fairinternal/mmf-internal#206

Pull Request resolved: #1315

The OpenLLaMa tokenizer maps its import path to read from LLaMa's tokenizer; this is incompatible with MMF's transformers patch. Wrap the module imports in a try/except block to fix these import errors.
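The guard described above can be illustrated with a small standalone sketch (this is not MMF's actual patch code; `safe_alias` is a hypothetical helper): register a module under an alias in `sys.modules` only if its import succeeds, and report failure instead of raising.

```python
import importlib
import sys


def safe_alias(alias_name: str, real_name: str) -> bool:
    """Import real_name and register it under alias_name in sys.modules.

    Returns False instead of raising when the import fails, mirroring
    the try/except pattern the patch uses to skip modules (such as the
    OpenLLaMa tokenizer) whose transitive imports are unavailable.
    """
    try:
        sys.modules[alias_name] = importlib.import_module(real_name)
        return True
    except ImportError:
        return False


# A module that imports cleanly gets aliased.
assert safe_alias("aliased_json", "json")
assert sys.modules["aliased_json"] is sys.modules["json"]

# A module with a broken or missing dependency is skipped, not fatal.
assert not safe_alias("aliased_missing", "no_such_module_xyz")
```

The key design point is that one broken model package no longer aborts the whole patching loop: the failure is contained to that single alias.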

Reviewed By: RylanC24, ankitade

Differential Revision: D46283948

fbshipit-source-id: 8769b837d3ef78b31986587e203e6b48d005082b
ebsmothers authored and facebook-github-bot committed Jun 1, 2023
1 parent def19b7 commit 536f503
Showing 1 changed file with 7 additions and 7 deletions.
14 changes: 7 additions & 7 deletions mmf/utils/patch.py
@@ -52,10 +52,6 @@ def patch_transformers(log_incompatible=False):
         if key.startswith("__"):
             continue
 
-        # Avoid LLaMa tokenizers 0.13.3 > 0.12.1 requirement as we're not using LLaMa
-        if key.startswith("llama"):
-            continue
-
         model_lib = importlib.import_module(f"transformers.models.{key}")
         if not hasattr(model_lib, "_modules"):
             if log_incompatible:
@@ -68,9 +64,13 @@ def patch_transformers(log_incompatible=False):
         for module in model_lib._modules:
             if not module or module == "." or module[0] == ".":
                 continue
-            sys.modules[f"transformers.{module}"] = importlib.import_module(
-                f"transformers.models.{key}.{module}"
-            )
+            try:
+                sys.modules[f"transformers.{module}"] = importlib.import_module(
+                    f"transformers.models.{key}.{module}"
+                )
+            except ImportError:
+                logger.info(f"Failed to import transformers.models.{key}.{module}")
+                continue
     sys.path = [sys.path[-1]] + sys.path[:-1]


