flash-attention: Compiler error

I tried compiling the latest build, but setup.py failed with the following compiler error:

/mnt/ceph/users/dberenberg/gustaf_stuff/repos/flash-attention/csrc/flash_attn/src/fmha_fprop_fp16_kernel.sm80.cu:62:385: internal compiler error: in maybe_undo_parenthesized_ref, at cp/semantics.c:1740
     BOOL_SWITCH(launch_params.is_dropout, IsDropoutConst, [&] {
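For context, the crash is an internal compiler error in gcc itself (cp/semantics.c is part of the g++ front end), triggered by a macro that turns a runtime boolean into a compile-time constant and passes it into a generic lambda. The sketch below is illustrative, not the repository's actual macro: the body of BOOL_SWITCH and the launch_count helper are assumptions, written only to show the dispatch pattern that older gcc releases (e.g. 7.x) have been reported to trip over.

```cpp
// Hypothetical sketch of a BOOL_SWITCH-style compile-time dispatch macro.
// A runtime bool selects between two branches, each of which defines a
// constexpr constant of the requested name and invokes the caller's lambda,
// so the lambda body can use the value as a template argument.
#define BOOL_SWITCH(COND, CONST_NAME, ...)         \
    [&] {                                          \
        if (COND) {                                \
            constexpr bool CONST_NAME = true;      \
            return __VA_ARGS__();                  \
        } else {                                   \
            constexpr bool CONST_NAME = false;     \
            return __VA_ARGS__();                  \
        }                                          \
    }()

// Illustrative stand-in for a templated kernel launcher: the boolean is a
// template parameter, so each instantiation is a separate compiled function.
template <bool IsDropout>
int launch_count() { return IsDropout ? 2 : 1; }

// Runtime-to-compile-time dispatch: inside the lambda, IsDropoutConst is a
// constexpr value usable as a template argument.
int dispatch(bool is_dropout) {
    return BOOL_SWITCH(is_dropout, IsDropoutConst, [&] {
        return launch_count<IsDropoutConst>();
    });
}
```

The design avoids writing the if/else by hand at every call site: one macro invocation instantiates both the dropout and no-dropout kernel variants and picks the right one at runtime.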

About this issue

  • State: closed
  • Created 2 years ago
  • Comments: 18 (12 by maintainers)

Most upvoted comments

Also, thanks for your great work! FlashAttention has improved my project so much!

Thanks so much @syorami. Looks like the compiler issue is fixed, so I’m closing the issue for now. The linking problem seems to be a separate issue with torch path and conda. Feel free to reopen if you run into problems.

That’s interesting. Let me set up an environment with gcc 7 to test things more thoroughly.