
[Bug] 01-Atom-7B-chat-WebDemo missing flash-attn dependency #440


Description

@yzw3270978316

Model exhibiting the bug

Atom-7B-chat

Tutorial for the affected model

01-Atom-7B-chat-WebDemo

Tutorial maintainer

@KMnO4-zx

Bug description

After installing the dependencies as described in the tutorial, the flash_attn installation fails with an error.

Steps to reproduce

Follow the installation steps in the tutorial; the flash_attn install step fails with an error.

Expected behavior

The flash-attn wheel linked below must be downloaded manually and installed locally:

https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu118torch2.0cxx11abiFALSE-cp38-cp38-linux_x86_64.whl
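A minimal sketch of the manual install, assuming `wget` and `pip` are available and the environment matches the wheel's tags (Python 3.8, PyTorch 2.0, CUDA 11.8):

```bash
# Download the prebuilt wheel whose tags (cp38 / torch2.0 / cu118) match the environment below
wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu118torch2.0cxx11abiFALSE-cp38-cp38-linux_x86_64.whl

# Install the local wheel instead of letting pip build flash-attn from source
pip install ./flash_attn-2.6.3+cu118torch2.0cxx11abiFALSE-cp38-cp38-linux_x86_64.whl
```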

Environment information

PyTorch 2.0.0
Python 3.8
Ubuntu 20.04
CUDA 11.8
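The wheel's filename tags must match this environment exactly. A quick check, assuming a standard PyTorch install:

```bash
# Print interpreter and torch/CUDA versions; these should correspond to the
# cp38 / torch2.0 / cu118 tags in the wheel filename above
python --version
python -c "import torch; print(torch.__version__, torch.version.cuda)"
```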

Additional information

Same as under Expected behavior: the flash-attn wheel linked above must be downloaded manually and installed locally.

Verification

  • This issue has not been reported in a previous issue.
