cli: Error in buildTypeScript: A project cannot be used in two compilations at the same time

I’m submitting a bug report

  • Library Version: 0.18.0 (CLI)

Please tell us about your environment:

  • Operating System: OSX 10.11.6
  • Node Version: 6.1.0
  • NPM Version: 3.8.6
  • Browser: all
  • Language: TypeScript 1.8.10

Current behavior: Frequently, when running au run --watch, the TypeScript compile seems to hang.

In the Activity Monitor I can see that the aurelia process suddenly starts consuming a lot of CPU:

[Screenshot: Activity Monitor, all processes]

Also, the Terminal window seems to hang, meaning I’m unable to kill the process with Ctrl+C. The output ends with a long stack trace like:

File Changed: src/views/devices/device-overview.ts
Starting 'readProjectConfiguration'...
Starting 'readProjectConfiguration'...
Finished 'readProjectConfiguration'
Finished 'readProjectConfiguration'
Starting 'processMarkup'...
Starting 'processCSS'...
Starting 'processMarkup'...
Starting 'processCSS'...
Starting 'configureEnvironment'...
Starting 'configureEnvironment'...
Finished 'configureEnvironment'
Starting 'buildTypeScript'...
Finished 'configureEnvironment'
Starting 'buildTypeScript'...
{ uid: 292,
  name: 'buildTypeScript',
  branch: false,
  error:
   { Error: gulp-typescript: A project cannot be used in two compilations at the same time. Create multiple projects with createProject instead.
       at compile (/Users/Hanssens/Work/git/xxx/app/node_modules/gulp-typescript/release/main.js:72:19)
       at buildTypeScript (/Users/Hanssens/Work/git/xxx/app/aurelia_project/tasks/transpile.ts:32:15)
       at bound (domain.js:280:14)
       at runBound (domain.js:293:12)
       at asyncRunner (/Users/Hanssens/Work/git/xxx/app/node_modules/async-done/index.js:36:18)
       at _combinedTickCallback (internal/process/next_tick.js:67:7)
       at process._tickDomainCallback (internal/process/next_tick.js:122:9)
     domain:
      Domain {
        domain: null,
        _events: {},
        _eventsCount: 0,
        _maxListeners: undefined,
        members: [] },
     domainThrown: true },
  duration: [ 0, 1050860 ],
  time: 1472216741812 }

In order to resolve this, I need to manually force quit the aurelia process in the Activity Monitor.
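
For context, the error itself comes from gulp-typescript: a project created with createProject can only drive one compilation at a time, and two rapidly triggered watcher runs end up piping through the same project concurrently. The following minimal sketch illustrates the two basic patterns and their trade-off (an illustration only, not the CLI’s actual transpile task; the paths and the 3.x-style tsProject() call are assumptions): a shared project keeps incremental build state but crashes on overlapping runs, while a per-run project is safe but recompiles from scratch every time.

import * as gulp from "gulp";
import * as ts from "gulp-typescript";

// Shared project: enables incremental compilation, but gulp-typescript refuses
// to run it in two overlapping compilations -- this is the error shown above.
const sharedProject = ts.createProject("tsconfig.json");

function buildTypeScriptShared() {
  return gulp.src("src/**/*.ts")
    .pipe(sharedProject()) // throws if a previous run is still in flight
    .js.pipe(gulp.dest("dist"));
}

// Per-run project: safe under overlapping watcher triggers, but every run
// starts cold, so incremental build state is lost.
function buildTypeScriptPerRun() {
  const freshProject = ts.createProject("tsconfig.json");
  return gulp.src("src/**/*.ts")
    .pipe(freshProject())
    .js.pipe(gulp.dest("dist"));
}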

Expected/desired behavior: It should not crash. Steps that can reproduce it:

  1. Start Visual Studio Code
  2. Add several .ts files
  3. Frequently edit and save the .ts files at random

In two projects I can reproduce this roughly every 10–15 minutes, depending on how many changes I make.

  • What is the expected behavior? Expected behaviour is NOT a crash.
  • What is the motivation / use case for changing the behavior? Fixing the crash.

About this issue

  • Original URL
  • State: closed
  • Created 8 years ago
  • Reactions: 3
  • Comments: 34 (13 by maintainers)

Commits related to this issue

Most upvoted comments

@EisenbergEffect I decided to take that challenge 😃, and spent some more hours today looking into this. I think I found a very promising, comprehensive solution to the problem that not only gets rid of the crashes, but also introduces some tremendous performance improvements for the watcher scenario. Let me give you an idea of what I’m talking about:

  • As a reminder, with the current setup in our main project:
    • Full “au build”: 28 seconds
    • Idle watcher, single file change (independent of file type): 18 seconds
    • Same, but with a change of two files: 34 seconds
  • With all of the changes outlined below applied:
    • Idle watcher, single .ts file change: 3 seconds
    • Idle watcher, single .html file change: < 1 second
    • Idle watcher, single .less file change: 7 seconds
    • Idle watcher, change of three files (.ts, .html, .less) “simultaneously” (“save all”): 9 seconds

Basically, I applied three or four changes, depending on how you look at it.

Important: When I sat down to work on this, I decided to do my experiments on the build task (build.ts), not the run task (run.ts). This feature (“build --watch”) has also been requested before (see: https://github.com/aurelia/cli/issues/265), and since we are not using the built-in serve feature at all, it was the logical choice for me. The following concepts fully apply no matter where you put the watcher logic; if you prefer to keep it in the run task, you would simply need to pass some of the information described below from the run task to the build task. I would of course appreciate it if the “build --watch” I just created also found its way into the official code base 😃.

1. Change the way you use gulp.watch

If you only want to fix the crashes and keep TypeScript’s incremental build features, you need to do the following:

  • Obviously, revert the previous “fix” for the crash to re-enable incremental builds (https://github.com/aurelia/cli/commit/2c74cfe45187cc4bb39ecf5d85a26e7fd287c2f8)
  • Replace the individual gulp.watch calls for each project source with a single gulp.watch that takes an array of globs instead. Since all of the individual watchers triggered the same process anyway, this does not change any behavior. So basically:
let watch = function() {
  // Before: one watcher per project source
  // gulp.watch(project.transpiler.source, refresh).on('change', onChange);
  // gulp.watch(project.markupProcessor.source, refresh).on('change', onChange);
  // gulp.watch(project.cssProcessor.source, refresh).on('change', onChange);

  // After: a single watcher covering all sources
  gulp.watch(
    [
      project.transpiler.source,
      project.markupProcessor.source,
      project.cssProcessor.source
    ], refresh).on('change', onChange);
};

What I would recommend in addition however, is to switch from the built-in gulp.watch feature to the separate gulp-watch package (subtle difference but huge improvement). The built-in feature does not pass on information about individual file changes, which makes further optimization difficult. The changes described in the details that follow rely on this, so if you want to add that benefit, you need to switch to gulp-watch (or a similar package that supports this).

Another small optimization is to turn off content reading. By default, the watcher reads the file contents, but they are thrown away because the actual build process re-reads them using gulp.src. For gulp-watch, there’s official documentation about how to turn content reading off (see options.read). Since both the built-in gulp.watch feature and gulp-watch rely on chokidar, I would suspect that should also work for gulp.watch (although it’s not documented there and I didn’t test that). Sample:

import * as gulpWatch from "gulp-watch";

let watch = function() {
  return gulpWatch(
    [
      project.transpiler.source,
      project.markupProcessor.source,
      project.cssProcessor.source
    ],
    {
      read: false, // performance optimization: do not read actual file contents
      verbose: true
    },
    (vinyl) => {
      // do something with the info, or simply trigger an incremental build across all sources here
    });
};

2. Debounce

The above solution fixes the crashes and improves performance for simple scenarios, but still has the “multiple successive builds” problem when more than one file is changed rapidly (“save all”). So I decided to take a look at debouncing the build. Amazingly, this took the most time to get right. The problem is mostly finding a solution that integrates nicely with gulp’s function composition and asynchronous nature. If you look around you’ll find quite a few discussions about this and how it should be done (for example here: https://github.com/gulpjs/gulp/issues/1304). I use the default “debounce” Node package. If you choose a different package, make sure it triggers on the trailing edge, or at least can be configured/forced to do so.

The proposed solutions all had the problem that they could not be applied easily to the Aurelia CLI, mostly because the CLI’s underlying gulp wrapper apparently makes some assumptions about the task structure and crashes if these are not satisfied (I mostly had problems with the makeInjectable function in the Aurelia CLI’s gulp.js). Since I didn’t want to dig that deep into the code base, I preferred bridging to the gulp world with a simple manual solution. It basically looks like this (pseudo-code; the full/real code follows below):

import * as debounce from "debounce";
const debounceWaitTime = 100;

let isBuilding = false;
let refresh = debounce(() => {
  if (isBuilding) {
    return;
  }

  isBuilding = true;

  triggerActualBuild()
    .then(() => {
      isBuilding = false;
      if (weHaveAnotherPendingBuildRequest) { refresh(); }
    });
}, debounceWaitTime);

There are probably nicer solutions than that, but like I said I wanted to fix this on the project template level and not dig into the Aurelia CLI package itself.
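
As a side note on the trailing-edge requirement above: if you reach for a different debounce implementation, you can usually state that behavior explicitly. Here is a small sketch using lodash as an example (assumed as an alternative for illustration, not what the comment above uses; runBuild is a hypothetical stand-in for whatever function kicks off the composed gulp build):

import { debounce } from "lodash";

// Placeholder for the function that kicks off the composed gulp build.
declare function runBuild(): void;

const debounceWaitTime = 100;

// Fire only on the trailing edge: wait until no further changes arrive for
// debounceWaitTime ms, then run the build exactly once.
const refresh = debounce(runBuild, debounceWaitTime, {
  leading: false,
  trailing: true
});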

3. Selective builds

With that in place you get rid of the crashes and prevent rapid changes to multiple files from triggering multiple successive builds. Basically, you’ll reach the “9 seconds” per incremental build performance level I mentioned in the beginning. Now, as you can see from the performance numbers above, our LESS task contributes the most to incremental build times, while at the same time we don’t actually do much LESS editing in our day-to-day work. With that in mind, I decided it would be nice to only trigger those parts of the build process that actually need to run, for example only doing a TypeScript compile if .ts files actually changed. To achieve this, here is what I did:

  • When gulp-watch triggers, push the changed file’s information into a “poor man’s queue” (an array) so it is preserved until the debounced function is actually triggered
  • In the refresh function, collect the actual file changes and test them against the configured globs of the project sources to determine which tasks need to be executed
  • That array can then also be used to determine whether another build needs to be triggered when the current one finishes (because more file changes may have piled up during the build)

Ok, let’s put the puzzle pieces together and see what we get.

import * as minimatch from "minimatch"; /* used to test paths against globs */
import * as gulp from "gulp";
import * as gulpWatch from "gulp-watch";
import * as debounce from "debounce";
// more imports as required, for Aurelia's sub tasks, project configuration etc...

const debounceWaitTime = 100;
let isBuilding = false;
let pendingRefreshPaths = [];

let watch = () => {
  return gulpWatch(
    [
      project.transpiler.source,
      project.markupProcessor.source,
      project.cssProcessor.source
    ],
    {
      read: false, // performance optimization: do not read actual file contents
      verbose: true
    },
    (vinyl) => {
      if (vinyl.path && vinyl.cwd && vinyl.path.startsWith(vinyl.cwd)) {
        let pathToAdd = vinyl.path.substr(vinyl.cwd.length + 1);
        log(`Watcher: Adding path ${pathToAdd} to pending build changes...`);
        pendingRefreshPaths.push(pathToAdd); 
        refresh();
      }
    });
};

The refresh function itself is the one that’s debounced, and looks like this:

let refresh = debounce(() => {
  if (isBuilding) {
    log("Watcher: A build is already in progress, deferring change detection...");
    return;
  }

  isBuilding = true;

  let paths = pendingRefreshPaths.splice(0);
  let tasks = [];
  
  // dynamically compose tasks, note: extend as needed, for example with copyFiles, linting etc.
  if (paths.find((x) => minimatch(x, project.cssProcessor.source))) {
    log("Watcher: Adding CSS tasks to next build...");
    tasks.push(processCSS);
  }

  if (paths.find((x) => minimatch(x, project.transpiler.source))) {
    log("Watcher: Adding transpile task to next build...");
    tasks.push(transpile);
  }

  if (paths.find((x) => minimatch(x, project.markupProcessor.source))) {
    log("Watcher: Adding markup task to next build...");
    tasks.push(processMarkup);
  }

  if (tasks.length === 0) {
    log("Watcher: No relevant changes found, skipping next build.");
    isBuilding = false;
    return;
  }
  
  let toExecute = gulp.series(
    readProjectConfiguration,
    gulp.parallel(tasks),
    writeBundles,
    (done) => {
      isBuilding = false;
      done();
      if (pendingRefreshPaths.length > 0) {
        log("Watcher: Found more pending changes after finishing build, triggering next one...");
        refresh();
      }
    }
  );

  toExecute();
}, debounceWaitTime);

The remaining build.ts content stays more or less the same; I only added the --watch command line option, like this:

let processBuildPipeline = gulp.series(
  readProjectConfiguration,
  gulp.parallel(
    transpile,
    processMarkup,
    processCSS,
    copyFiles
  ),
  writeBundles
);

let main;

if (CLIOptions.hasFlag("watch")) {
  main = gulp.series(
    processBuildPipeline,
    watch
  );
} else {
  main = processBuildPipeline;
}

export default main;

function readProjectConfiguration() {
  return build.src(project);
}

function writeBundles() {
  return build.dest();
}

function log(message: string) {
  console.log(message);
}

I hope this is detailed enough that at least some of the improvements can find their way into your code base; providing a full working sample is not that easy, as I would have to go over the code base and remove more company-specific details (which I also stripped from the fragments above). I would love to see the CLI improve in this direction so others can benefit from these changes too.

Cheers 😃

@MisterGoodcat Thanks for taking a look! Glad it looks okay. I’m not surprised it performs like your implementation, though, because a lot of it pretty much is your implementation. 😃

@jwx If you want to reproduce the original crash, a trivial setup is sufficient. Like:

  • Create a new project. Make sure you have at least one .ts file and one .html file in your src folder
  • Start the watcher
  • Change both the .ts and the .html file (a trivial change like adding whitespace), but do not save
  • Use the “save all” feature of your editor to invoke both internal gulp watchers in a short time frame
  • Watch the crash

With that, I could reproduce the issue 100% of the time. Meaning, if you are not able to crash the watcher that way, you have fixed the issue 😃.

For performance testing, I’m willing to help, but you would have to provide the template, improved build tasks or some other instructions to me, as the projects in question are non-public.

@gama410 You are right that the crash is currently fixed; however the more recent parts of the discussion focused on an alternative solution that not only fixes the crash but also improves performance of the watcher tremendously.

@MisterGoodcat Glad to hear! Thanks for your work on this. I will pick this up; I just need to finish work on a couple of other issues first.

Quick update: one of my teams used these improvements during the week, and confirms the watcher is rock solid now. It even survives switching large branches, when dozens of files change simultaneously. We can also confirm the above performance numbers accurately reflect day-to-day use. I’m glad I invested that time; it really improved working on the code base a lot.

I’m getting this a LOT. Every few edits. Does everyone here have multiple bundles? Does this happen when you’re editing files in 2 diff bundles and they get saved at the same time?