eslint-plugin-ava: add `failing-issue-link` rule

If a user adds a failing test, they should be required to add a link to the issue tracking the failure. We discussed adding that to the AVA API, but it would be just as easy to enforce that users add a comment in the vicinity of the failing test documenting the failure.

I have some ideas for options (not all of them are necessarily good), so let’s bikeshed:

Position Option

The rule should have a position option, with available values being above | within | both. I don’t see the value of a below option, but I wouldn’t fight its inclusion.

// position: above
// https://github.com/sindresorhus/eslint-plugin-ava/issues/108
test.failing(t => {});

// position: within
test.failing(t => {
  // https://github.com/sindresorhus/eslint-plugin-ava/issues/108
});
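
Configuration could then look roughly like this (a sketch only; the option name and shape aren’t final):

// Hypothetical .eslintrc.js sketch; rule option names are not final
module.exports = {
	rules: {
		'ava/failing-issue-link': ['error', {position: 'above'}]
	}
};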

Link Style Option

Allow users to configure the required GitHub link style:

  • full: https://github.com/sindresorhus/eslint-plugin-ava/issues/108 (https:// should be optional)
  • short: sindresorhus/eslint-plugin-ava#108
  • hash: #108
  • any: equivalent to full|hash|short

Full URLs would always be accepted if they link outside of GitHub.

Short URLs would be accepted in hash mode if they link outside the current project’s repo (which we could figure out by polling git remotes? Or just make it a config option?).
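
For the git-remote idea, a minimal sketch of how the rule could resolve the current repo slug (this assumes an origin remote pointing at GitHub and is purely illustrative, not a proposed API):

// Sketch: resolve "owner/repo" from the origin remote (illustrative only)
const {execSync} = require('child_process');

function currentRepoSlug() {
	const url = execSync('git remote get-url origin').toString().trim();
	const match = url.match(/github\.com[:/]([^/]+\/[^/.]+)/);
	return match ? match[1] : undefined;
}

// currentRepoSlug() would yield 'sindresorhus/eslint-plugin-ava' in this repo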

Issue Regexp Option

If they aren’t using GitHub, then the link style option doesn’t help much. Instead, allow them to configure a list of regular expressions describing what counts as a valid issue link.
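
For example, something along these lines (issueRegexps is a hypothetical option name and the Jira URL is made up):

// Hypothetical config for a non-GitHub tracker; option name and URL are made up
module.exports = {
	rules: {
		'ava/failing-issue-link': ['error', {
			issueRegexps: ['^https://jira\\.example\\.com/browse/[A-Z]+-\\d+$']
		}]
	}
};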

About this issue

  • State: open
  • Created 8 years ago
  • Comments: 18 (18 by maintainers)

Most upvoted comments

Nice! You should really just create a catalog of your modules - I miss so many when trying to use npm search.

Would iTerm2 let you click that?

Yes, just tried.

I was thinking it might just end up being more clutter.

I think this is one of the times we want clutter. People should ideally be annoyed by it and fix the issues, like with TODO warnings in ESLint. Having an easily clickable link to the issue makes it more likely some random contributor running the tests will see it and decide to fix it. Just a thought.