pulumi: Pulumi became ~7x slower since version 2.16.2
Expected Behavior
Smooth and fast runtime experience.
Current Behavior
Lately we've noticed that pulumi up has been taking significantly longer than usual.
Steps to Reproduce
$ curl -fsSL https://get.pulumi.com | sh -s -- --version 2.16.2   # or any later version
$ pulumi up
Context (Environment)
I tracked down the issue by trying every single release in between. Here is a brief summary of the output and runtime for each version.
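For quick comparison, the wall-clock numbers from the runs below are:
- 2.15.6: Duration 48s (real 1m4s)
- 2.16.1: Duration 49s (real 1m8s)
- 2.16.2: Duration 5m45s (real 6m5s)
- 2.18.2: Duration 5m58s (real 6m16s)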
2.15.6
$ time pulumi up --suppress-outputs
Previewing update (prod):
Type Name Plan
pulumi:pulumi:Stack prod
Resources:
236 unchanged
Permalink: https://mybucket.s3.amazonaws.com/....
Do you want to perform this update? yes
Updating (prod):
Type Name Status
pulumi:pulumi:Stack prod
Resources:
236 unchanged
Duration: 48s
Permalink: https://mybucket.s3.amazonaws.com/....
warning: A new version of Pulumi is available. To upgrade from version '2.15.6' to '2.18.2', run
$ curl -sSL https://get.pulumi.com | sh
or visit https://pulumi.com/docs/reference/install/ for manual instructions and release notes.
real 1m4.206s
user 0m8.295s
sys 0m1.480s
2.16.1
$ time pulumi up --suppress-outputs
Previewing update (prod):
Type Name Plan
pulumi:pulumi:Stack prod
Resources:
236 unchanged
Permalink: https://mybucket.s3.amazonaws.com/....
Do you want to perform this update? yes
Updating (prod):
Type Name Status
pulumi:pulumi:Stack prod
Resources:
236 unchanged
Duration: 49s
Permalink: https://mybucket.s3.amazonaws.com/....
warning: A new version of Pulumi is available. To upgrade from version '2.16.1' to '2.18.2', run
$ curl -sSL https://get.pulumi.com | sh
or visit https://pulumi.com/docs/reference/install/ for manual instructions and release notes.
real 1m8.144s
user 0m8.883s
sys 0m1.951s
2.16.2
$ time pulumi up --suppress-outputs
Previewing update (prod):
Type Name Plan
pulumi:pulumi:Stack prod
Resources:
236 unchanged
Permalink: https://mybucket.s3.amazonaws.com/....
Do you want to perform this update? yes
Updating (prod):
Type Name Status
pulumi:pulumi:Stack prod
Resources:
236 unchanged
Duration: 5m45s
Permalink: https://mybucket.s3.amazonaws.com/....
warning: A new version of Pulumi is available. To upgrade from version '2.16.2' to '2.18.2', run
$ curl -sSL https://get.pulumi.com | sh
or visit https://pulumi.com/docs/reference/install/ for manual instructions and release notes.
real 6m4.890s
user 0m34.030s
sys 0m4.367s
2.18.2
$ time pulumi up --suppress-outputs
Previewing update (prod):
Type Name Plan
pulumi:pulumi:Stack prod
Resources:
236 unchanged
Permalink: https://mybucket.s3.amazonaws.com/....
Do you want to perform this update? yes
Updating (prod):
Type Name Status
pulumi:pulumi:Stack prod
Resources:
236 unchanged
Duration: 5m58s
Permalink: https://mybucket.s3.amazonaws.com/....
real 6m16.338s
user 0m28.163s
sys 0m2.973s
Additional info
The exported stack state is about 2.6 MB:
$ pulumi stack export | wc -c
2633995
- Backend: s3
- OS: Ubuntu 20.04.1 LTS (WSL2)
- Language: TypeScript
- Cloud Provider: AWS
v2.15.6 Profile files:
2156.14400.cpu.zip 2156.14400.mem.zip
v2.16.2 Profile files:
2162.11978.cpu.zip 2162.11978.mem.zip
v2.18.2 Profile files:
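For reference, file names like 2162.11978.cpu suggest these were generated with Pulumi's built-in --profiling flag, which emits CPU and memory profiles named [prefix].[pid].cpu and [prefix].[pid].mem. The exact invocation is an assumption, but would look something like:
$ pulumi up --suppress-outputs --profiling 2162
The resulting profile files were presumably zipped before attaching.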
About this issue
- State: closed
- Created 3 years ago
- Reactions: 7
- Comments: 21 (11 by maintainers)
Hi folks, pulumi is awesome, but it's pretty much unusable for anything beyond toy projects - on a medium-sized project with ~200 resources I have to wait 10-15 minutes for each pulumi up, and it's getting worse every release 😦
I have collected a profile and made a trace, but I fear that the trace might contain private data - I can share it by e-mail if required. My understanding is that CPU/memory profiles do not contain private data, so I'm uploading them to this ticket.
moneymeets.13305.cpu.gz moneymeets.13305.mem.gz
/cc @marns93
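For anyone who wants to collect the same kind of data, an execution trace and CPU/memory profiles can be captured with Pulumi's --tracing and --profiling flags (the file names here are placeholders):
$ pulumi up --tracing file:./up.trace --profiling pulumi-profile
As noted above, a trace can contain resource names and other project details, so review it before sharing publicly.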
Hi @lukehoban - awesome, I've just tested again with the same project whose logs I shared, and… 1m43s vs 15mXXs makes a huge difference for us! Thank you very much for taking care of it so quickly 👍
I do think that is part of why we haven’t seen more reports of this - even though the issue would affect most users to some degree.
But when I said I couldn’t repro it, I meant I couldn’t even trigger a case where mustWrite returned true. That said - I am fairly confident that the fix in the linked PR will address this.
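A rough back-of-the-envelope, assuming (based on the mustWrite discussion above) that the ~2.6 MB checkpoint gets re-serialized and written to the S3 backend after each of the 236 resource steps: at even ~1-1.5s per write, that alone accounts for roughly 4-6 minutes, which lines up with the jump from ~1 minute to ~6 minutes in the timings above.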
We have a similar regression after 2.15.6.
- 2.15.6: ~13m
- 2.16.2: ~35m
Let me know if you would like us to profile as well.