### Description

### Checks
- [x] This template is only for bug reports; usage problems should go to 'Help Wanted'.
- [x] I have thoroughly reviewed the project documentation but couldn't find information to solve my problem.
- [x] I have searched for existing issues, including closed ones, and couldn't find a solution.
- [x] I am using English to submit this issue to facilitate community communication.
### Environment Details

Ubuntu 22.04, F5-TTS installed with `pip install f5-tts`. When I select the Qwen/Qwen2.5-3B-Instruct chat model, the following error is raised:
```
Traceback (most recent call last):
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/gradio/queueing.py", line 625, in process_events
    response = await route_utils.call_process_api(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/gradio/route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/gradio/blocks.py", line 2051, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/gradio/blocks.py", line 1598, in call_function
    prediction = await anyio.to_thread.run_sync(  # type: ignore
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2470, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 967, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/gradio/utils.py", line 883, in wrapper
    response = f(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/f5_tts/infer/infer_gradio.py", line 756, in load_chat_model
    chat_model_state = AutoModelForCausalLM.from_pretrained(chat_model_name, torch_dtype="auto", device_map="auto")
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/transformers/modeling_utils.py", line 309, in _wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/transformers/modeling_utils.py", line 4508, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/transformers/models/phi3/modeling_phi3.py", line 675, in __init__
    self.model = Phi3Model(config)
                 ^^^^^^^^^^^^^^^^^
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/transformers/models/phi3/modeling_phi3.py", line 404, in __init__
    self.post_init()
  File "/home/suwei/.conda/envs/f5-tts/lib/python3.11/site-packages/transformers/modeling_utils.py", line 1969, in post_init
    if v not in ALL_PARALLEL_STYLES:
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'NoneType' is not iterable
```
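The last frames point at the mechanism: `post_init` runs a membership test against `ALL_PARALLEL_STYLES`, which is evidently `None` in this environment. A minimal sketch of that mechanism (simplified and paraphrased from the traceback, not the actual `transformers` source; the `tp_plan` dict below is illustrative):

```python
# Simplified sketch: if ALL_PARALLEL_STYLES is None (assumption: it is left
# unset when tensor-parallel support is unavailable), any `in` test on it
# raises exactly the TypeError seen above.
ALL_PARALLEL_STYLES = None

def post_init_check(tp_plan):
    """Mimic the failing validation loop from the traceback."""
    for v in tp_plan.values():
        if v not in ALL_PARALLEL_STYLES:  # TypeError when styles is None
            raise ValueError(f"Unsupported parallel style {v!r}")

try:
    post_init_check({"layers.self_attn.q_proj": "colwise"})  # hypothetical plan entry
except TypeError as e:
    print(e)  # argument of type 'NoneType' is not iterable
```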
### Steps to Reproduce

```shell
conda create -n f5-tts python=3.11
conda activate f5-tts
pip install torch==2.4.0+cu124 torchaudio==2.4.0+cu124 --extra-index-url https://download.pytorch.org/whl/cu124
pip install f5-tts
```
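To help triage, here is a small snippet that reports the exact versions installed by the steps above (package names are the PyPI names; an assumption, not output from my machine):

```python
from importlib.metadata import PackageNotFoundError, version

# Report the versions relevant to this issue; a transformers/torch version
# mismatch is one thing maintainers may want to rule out.
for pkg in ("torch", "torchaudio", "transformers", "f5-tts", "gradio"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```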
### ✔️ Expected Behavior

The chat model loads successfully and the chat feature is usable.
### ❌ Actual Behavior

`AutoModelForCausalLM.from_pretrained` fails during `post_init` with `TypeError: argument of type 'NoneType' is not iterable` (full traceback above).