<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Reflame blog]]></title><description><![CDATA[[Reflame](https://reflame.app) deploys React web apps in milliseconds, to previews and to production.
Never wait for a deploy ever again!]]></description><link>https://blog.reflame.app</link><image><url>https://cdn.hashnode.com/res/hashnode/image/upload/v1666327927620/BfDxQXzOs.png</url><title>Reflame blog</title><link>https://blog.reflame.app</link></image><generator>RSS for Node</generator><lastBuildDate>Sun, 12 Apr 2026 07:09:18 GMT</lastBuildDate><atom:link href="https://blog.reflame.app/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Reflame now deploys your NPM package updates faster than a node_modules cache restore]]></title><description><![CDATA[My first order of business with Reflame is to make waiting for deployments a relic of the past.
Ever since the very first MVP, Reflame has been able to meet this bar for most typical deploys users make on a day-to-day basis, usually landing somewhere...]]></description><link>https://blog.reflame.app/reflame-now-deploys-your-npm-package-updates-faster-than-a-nodemodules-cache-restore</link><guid isPermaLink="true">https://blog.reflame.app/reflame-now-deploys-your-npm-package-updates-faster-than-a-nodemodules-cache-restore</guid><category><![CDATA[React]]></category><category><![CDATA[npm]]></category><category><![CDATA[node_modules]]></category><category><![CDATA[deployment]]></category><category><![CDATA[performance]]></category><dc:creator><![CDATA[Lewis Liu]]></dc:creator><pubDate>Mon, 05 Dec 2022 02:18:19 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1670120151963/0105680d-e46e-4516-88a4-6d3e40488266.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>My first order of business with Reflame is to make waiting for deployments a relic of the past.</p>
<p>Ever since the very first MVP, Reflame has been able to meet this bar for most typical deploys users make on a day-to-day basis, usually landing somewhere between:</p>
<ul>
<li><p>100-400ms when deploying from our VSCode extension</p>
</li>
<li><p>500-2000ms when deploying from our GitHub app</p>
</li>
</ul>
<blockquote>
<p>If you're used to a traditional deployment service that takes <a target="_blank" href="https://blog.reflame.app/i-compared-deploy-speeds-for-reflame-vercel-netlify-cloudflare-pages-on-the-same-repo">at minimum 10s of seconds to deploy even the smallest of changes on the tiniest of apps</a> and only gets worse over time, your first reaction to the claims above is probably "I'll believe it when I see it".</p>
<p>For you, dear discerning reader, I have a demo:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://www.youtube.com/watch?v=SohUnrjiIxk">https://www.youtube.com/watch?v=SohUnrjiIxk</a></div>
<p> </p>
<p>Still skeptical? Reflame is completely free for solo hobby projects, so take it for a spin for yourself by signing up at <a target="_blank" href="https://reflame.app">https://reflame.app</a>.</p>
</blockquote>
<p>Talking just about typical deploys doesn't paint the full picture, however.</p>
<p>There were two notable edge cases in Reflame that I'd always been acutely aware of, ones that were distinctly <em>not</em> free of waiting.</p>
<p>Over the past few weeks, I've made substantial improvements to both of these edge cases, but today's post will be focusing on the one that will have a much more noticeable effect on the day-to-day quality of life for most Reflame users:</p>
<p>Updating NPM packages.</p>
<blockquote>
<p>Some quick spoilers on the other edge case since I can't help myself:</p>
<p>Deploying a ton of brand new modules at the same time also used to scale very poorly.</p>
<p>The compute efficiency of <a target="_blank" href="https://babeljs.io/">Babel</a> left much to be desired, heavily bottlenecking us to the point of almost linear latency scaling with the number of modules being deployed: if deploying a single 500ish-line TSX module took 300ms, deploying 100 modules was likely to take somewhere in the ballpark of 20 seconds.</p>
<p>Because Reflame is really good at never doing the same piece of work twice, this ended up being an extremely rare edge case in practice, basically only limited to:</p>
<ul>
<li><p>When importing an existing repo as a new Reflame app. All/most modules in the repo are new to Reflame, so work needed to be done for every one.</p>
</li>
<li><p>When making large-scale refactors through codebase-wide search &amp; replace, or third-party refactoring tools. All of these changed modules are new to Reflame as well, so work needed to be done for each.</p>
</li>
<li><p>When we need to invalidate Reflame's transform caches due to changes in our transforms/caching logic, which is still happening a bit more frequently than I'd like. Every time we do this, new work will need to be done for every module in every app.</p>
</li>
</ul>
<p>I ended up switching to a custom fork of <a target="_blank" href="https://swc.rs/">SWC</a> to reduce the wall-clock compute time required for transforming a 500ish-line module from ~100ms to ~10ms, drastically improving the scaling characteristics for deploying tons of modules (most of the wall-clock time is now dominated by network I/O, which is a lot easier to parallelize than compute).</p>
<p>Not only that, but doing this also chopped off a good ~90ms from typical single module deploys, making the typical deploys in Reflame even faster!</p>
<p>In the hypothetical example earlier, deploying that single 500ish-line TSX module would now take closer to 210ms, and deploying 100 similar modules would likely take somewhere in the ballpark of 5 seconds!</p>
<p>I'm saving the juicy technical details and benchmarks for a future blog post. No promises on when though.</p>
</blockquote>
<h2 id="heading-the-past">The past</h2>
<p>Updating NPM packages in Reflame tended to scale in latency with the total size and number of all the NPM packages in the app.</p>
<p>To give you a better sense of the scaling characteristics here:</p>
<ul>
<li><p>It could take about 20s to bump the version of a single package in <a target="_blank" href="https://reflame.app">reflame.app</a>, which used about 16 packages at the time (direct dependencies only, not counting transitive dependencies).</p>
</li>
<li><p>Contrast this to about 5s to do the same in a newly-created app with just 1 package.</p>
</li>
</ul>
<p>We cached sets of NPM packages so you never have to wait for the deployment of the same set of packages twice, which is good.</p>
<p>However, the "set of NPM packages" was the only granularity we cached at, meaning any change to an individual package within that set (say, a version bump) would invalidate the cache and require entirely new work to be done for the new set. Not so good.</p>
<p>This approach compares somewhat poorly even to traditional CLI-tooling-based deployment platforms if we consider it <em>in isolation</em>, since caching and restoring directories like node_modules (a very commonplace optimization) can make small updates to large sets of NPM packages much faster than doing a full install of the entire package set.</p>
<p>So why didn't Reflame go with the same approach?</p>
<p>Eagle-eyed readers will have noticed the italics around <em>in isolation</em> when I contrasted the approaches earlier.</p>
<p>While the node_modules directory caching approach makes the task of updating specific NPM packages fast <em>in isolation</em>, it's important to take into account the end-to-end latency contributed by the very act of saving and restoring the entire node_modules directory on every deploy.</p>
<p>We've all had a nice chuckle at this meme before:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1670117600300/20593fd9-f72e-4df4-ae76-4e029bcd2854.png" alt="Heaviest objects in the universe, presented in terms of how they warp spacetime: Sun (small dip in spacetime fabric), neutron star (deep hole), black hole (much deeper hole), node_modules (even deeper hole with its ends cut off from view)." /></p>
<p>To anyone who's ever tried to make package installs fast, the size of node_modules is no joke. I'm sure most of us can find a couple of node_modules folders with sizes in the low single-digit gigabytes if not worse.</p>
<p>This means just the very act of caching and restoring the node_modules folder itself can take 10s of seconds as we add more and more dependencies, before we even begin to work on actual updates.</p>
<p>This eventually imposes a latency floor of 10s of seconds on every single deploy, even ones where only a tiny handful of app source modules are changed, and <em>no NPM packages are even updated</em> (by far the most common deployment scenario), just to make the occasional NPM package update faster. Not a great tradeoff if you ask me.</p>
<p>Everybody takes latency floors like this for granted today, but Reflame needed to do better if we wanted any chance of achieving our goal of 0-wait deployments, even just for the typical use case of updating only a tiny handful of app modules.</p>
<p>So, the rather naive solution I came up with for the Reflame MVP had the fairly obvious downside of doing a lot more work than necessary whenever a new set of packages was encountered, even if that new set was almost precisely identical to a previously seen set, differing in just a single version number of a single package.</p>
<p>But it was simple and quick to implement (important because I had already spent more than half a year burning through my savings to work on the rest of Reflame full time at that point), and could preserve the most important property of the deployment system I was trying to build: decoupling the speed of typical deploys (i.e. ones that don't update NPM packages) from the size of your NPM packages.</p>
<p>The idea was, in exchange for having to wait a bit longer when updating NPM packages, you don't have to wait at all when deploying anything that doesn't involve a change in NPM packages.</p>
<p>That was a tradeoff I was much more willing to accept in the short term, and judging from the absence of complaints from early users about having to wait for NPM package updates, thankfully it looks like I made the right call there.</p>
<p>But clearly, we still had a lot of room for improvement here, and anything that even has the potential to scale to 10s of seconds in latency deserves a place on our roadmap if I have anything to say about it.</p>
<h2 id="heading-the-present">The present</h2>
<p>As of the time of this tweet:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://twitter.com/lewisl9029/status/1593325423525392384">https://twitter.com/lewisl9029/status/1593325423525392384</a></div>
<p> </p>
<p>I'm proud to announce that this optimization has been knocked off of our roadmap and into production.</p>
<p>Now, the latency in updating NPM packages only scales with the size of the packages (and downstream transitive dependencies) actually being updated, instead of with the size of the entire set of packages used by the app.</p>
<p>In practice, this means bumping versions of individual packages will generally take somewhere in the ballpark of 1-5s, depending on the sizes of the packages being updated. Check out the demo in the Tweet to see it in action.</p>
<p>This is much faster than even the minimum latency floors on other deployment services <a target="_blank" href="https://blog.reflame.app/i-compared-deploy-speeds-for-reflame-vercel-netlify-cloudflare-pages-on-the-same-repo">under very ideal circumstances</a>.</p>
<p>We did this by making our caching logic much more granular. It now operates on the level of individual package versions, instead of on the level of entire package sets.</p>
<p>This means when you bump a package version, Reflame only needs to do work for the new version of the package you bumped to (and any new versions of its new transitive dependencies), instead of for the entire set of packages specified in your config.</p>
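<p>Here's a minimal sketch of the granular approach (hypothetical, assuming each <code>name@version</code> pair is cached independently): diffing the new package set against the cache leaves only the bumped versions as work to do.</p>

```javascript
// Hypothetical sketch of per-package-version caching (not Reflame's actual
// code). Each name@version pair is cached independently, so a version bump
// only produces work for the versions not already in the cache.
const workToDo = (packages, cache) =>
  Object.entries(packages)
    .map(([name, version]) => `${name}@${version}`)
    .filter((key) => !cache.has(key));

// Everything deployed before is already cached...
const cache = new Set(["react@18.2.0", "react-dom@18.2.0", "zod@3.19.1"]);

// ...so bumping zod leaves exactly one cache miss to process.
console.log(
  workToDo({ react: "18.2.0", "react-dom": "18.2.0", zod: "3.20.0" }, cache),
); // [ 'zod@3.20.0' ]
```

<p>Note that reverting zod back to 3.19.1 would produce an empty work list, since that version is still in the cache.</p>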
<p>A cool bonus of the new granular caching system is that if you revert to a previously used package version, that deploy will take the fast path using the cache metadata and complete in under a second flat, since work has already been done for that version, even if you have updated other packages since.</p>
<p>An even cooler bonus, coming from the fact that the NPM package cache is shared across all Reflame users, is that when you use any package other users have used before (e.g. React, which is used by every Reflame app by default), Reflame will take advantage of the shared cache to skip work and make your deploy even faster, even if you have personally never used the package before.</p>
<p>Lastly, this was true before as well, but it's useful to mention that if you use our VSCode extension, preview deploys using our GitHub app will usually take the super-fast path regardless of what changes you make, because those changes will likely already have been deployed by our VSCode extension by the time you push them up (with this change, even most NPM package updates complete faster than the typical git push, let alone source module updates that usually complete in under 400ms).</p>
<h2 id="heading-the-future">The future</h2>
<p>So, NPM package updates in Reflame are now faster than in any other deployment system out there. That doesn't mean we're done improving them, though!</p>
<p>For one, the shared package cache is designed to take advantage of the built-in network effects of Reflame as a SaaS by making package updates faster for everybody the more users use it. Even if we don't make any changes from our side, package updates will get faster for everybody over time as more packages are requested on it.</p>
<p>You can also imagine how we could help it out by listening to the firehose of updates from the NPM package registry and preprocessing every new package version the moment it gets published. This would lead to effectively 0 cache misses for new packages, meaning <em>every</em> package update would be ridiculously fast, all the time, for everybody (at least until we have to evict the least-used packages from our cache).</p>
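<p>As a rough sketch of what consuming that firehose could look like (assuming the registry's CouchDB-style <code>_changes</code> feed at replicate.npmjs.com; its exact shape and availability are assumptions here, not something Reflame has confirmed using):</p>

```javascript
// Hypothetical sketch of following the npm registry's change feed
// (a CouchDB-style _changes endpoint; the exact payload shape is an
// assumption). Each page yields changed package names plus a sequence
// number to resume polling from, so every new publish could be
// preprocessed shortly after it lands.
const extractChanges = (feedPage) => ({
  // Package names that changed in this page of the feed.
  packages: feedPage.results.map((change) => change.id),
  // Where to resume polling from on the next request.
  since: feedPage.last_seq,
});

// A hand-written sample page, standing in for a real request like:
//   fetch(`https://replicate.npmjs.com/_changes?since=${since}`)
const samplePage = {
  results: [
    { seq: 101, id: "react" },
    { seq: 102, id: "left-pad" },
  ],
  last_seq: 102,
};

console.log(extractChanges(samplePage));
// { packages: [ 'react', 'left-pad' ], since: 102 }
```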
<p>This requires throwing more money at the problem than I can justify at the moment, but is something I'm seriously planning to explore down the line. Exciting times ahead!</p>
]]></content:encoded></item><item><title><![CDATA[I compared deploy speeds for Reflame, Vercel, Netlify, Cloudflare Pages on the same repo]]></title><description><![CDATA[Update Nov 12, 2022
Matt Kane from Netlify replied to my Twitter thread explaining that:
1) Part of the issue here is my Netlify account had branch builds enabled, which resulted in two builds running for every commit. This resulted in longer queue t...]]></description><link>https://blog.reflame.app/i-compared-deploy-speeds-for-reflame-vercel-netlify-cloudflare-pages-on-the-same-repo</link><guid isPermaLink="true">https://blog.reflame.app/i-compared-deploy-speeds-for-reflame-vercel-netlify-cloudflare-pages-on-the-same-repo</guid><category><![CDATA[vite]]></category><category><![CDATA[React]]></category><category><![CDATA[Vercel]]></category><category><![CDATA[Netlify]]></category><category><![CDATA[cloudflare]]></category><dc:creator><![CDATA[Lewis Liu]]></dc:creator><pubDate>Sat, 12 Nov 2022 20:33:21 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1668474036384/9fW5xF16l.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-update-nov-12-2022">Update Nov 12, 2022</h2>
<p><a target="_blank" href="https://twitter.com/ascorbic/status/1591728244323389440">Matt Kane from Netlify replied</a> to my Twitter thread explaining that:</p>
<p>1) Part of the issue here is my Netlify account had branch builds enabled, which resulted in two builds running for every commit. This resulted in longer queue times with a concurrency level of 1, which would explain ~20s of the queuing time seen here, spent running a branch build of similar length.</p>
<p>2) The other part is Netlify's previews run on PR creation, not on every commit like on all the other services tested here. This explains another ~5s of delay between creating the branch and opening the PR.</p>
<p>That unfortunately still leaves ~30s of the total ~55s of queuing time above unaccounted for. Initial tests with branch builds turned off show results that are much more competitive, but they were performed at 4AM on a Sunday, so I can't conclusively rule out additional queuing delays during busier hours yet. Will be investigating further during PST business hours. Stay tuned for another update!</p>
<h2 id="heading-update-nov-14-2022">Update Nov 14, 2022</h2>
<blockquote>
<p>Shorter Twitter thread version of this update if you're busy: https://twitter.com/lewisl9029/status/1592341968780001280</p>
</blockquote>
<p>I reran the original test with Netlify branch builds disabled at 2:20PM PST on a Monday, to investigate queuing behavior during times of higher traffic. </p>
<p>Here's the recording:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://www.youtube.com/watch?v=jrPKIzQw4RU">https://www.youtube.com/watch?v=jrPKIzQw4RU</a></div>
<p>Here's an updated chart based on these new timestamps:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://codesandbox.io/embed/preview-deployment-timeline-updated-qx71j6?fontsize=14&amp;hidenavigation=1&amp;theme=dark&amp;view=preview">https://codesandbox.io/embed/preview-deployment-timeline-updated-qx71j6?fontsize=14&amp;hidenavigation=1&amp;theme=dark&amp;view=preview</a></div>
<p>We can see that Netlify is now much more competitive without the double builds from enabling branch deploys, finishing ahead of Cloudflare Pages by ~8s. </p>
<p>However, queuing time at ~10s is still measurably higher than the other services tested, and contributed the majority of the ~17s difference between Netlify and second-place Vercel. Even if we remove the ~3s head start the others enjoyed due to starting at commit instead of at PR creation, Netlify would still end up ~14s behind.</p>
<p>This queuing time at ~10s with a single PR preview deploy also gives me a much stronger hypothesis for the composition of our earlier result of ~55s queuing time with the double build from enabling branch deploys:</p>
<ul>
<li>~10s queue time for initial branch deploy</li>
<li>~15-20s build time for initial branch deploy</li>
<li>~10s queue time for subsequent PR deploy</li>
<li>~15-20s build time for subsequent PR deploy</li>
</ul>
<p>This adds up to 50s-60s when performed sequentially with a concurrency level of 1, which matches perfectly with what we previously observed.</p>
<p>I've updated the subtitle to something a bit less inflammatory in an attempt to reflect the fact that the queuing time for most users is likely not going to be as egregious as I previously experienced. For full transparency, this is what it was previously:</p>
<blockquote>
<p>Biggest takeaway: Netlify might be lying to you about their deploy speed</p>
</blockquote>
<p>Here's what I updated it to:</p>
<blockquote>
<p>Biggest takeaway: Reported deploy speeds might not be painting the full picture</p>
</blockquote>
<h3 id="heading-feedback-for-netlify">Feedback for Netlify</h3>
<p>With all that said, here is some feedback for Netlify based on my experience comparing all of these deployment services thus far:</p>
<h4 id="heading-enabling-branch-deploys-shouldnt-result-in-double-builds-on-branches-with-prs">Enabling branch deploys shouldn't result in double builds on branches with PRs</h4>
<p>The branch deploys option currently significantly increases queue times and costs for PR deploys. It's a gigantic footgun, completely counterintuitive to how I expected the option to work, and I suspect I'm not the only one who casually enabled it without realizing its negative effects.</p>
<p>Every other deployment service tested deploys every commit on every branch by default, and just reuses those commit deploys for PR previews, with 0 opportunity for double builds.</p>
<p>I don't necessarily expect Netlify to change their current defaults, since there is a tradeoff in costs involved in building all commits vs. only those on branches with associated PRs, and it's fully up to them to decide which side of that tradeoff would benefit the most users.</p>
<p>I would, however, like to see an option to enable the "deploy every commit on every branch" behavior, because it can result in significantly faster deploy speeds for newly pushed branches, sometimes even making previews ready by the time we create the PR. </p>
<p>Branch deploys could have been this option, but the double builds on PRs issue makes it unsuitable for any use case outside of maintaining long-lived non-default branch previews (for staging, testing, etc). </p>
<p>Would be great to see this addressed, either by updating the behavior of the existing option, or by introducing a new option if there's a need to preserve the existing behavior (I've been failing to think of any scenarios where the current behavior would be preferable, but that could be just a lack of imagination on my part).</p>
<h4 id="heading-queuing-times-during-high-traffic-hours-needs-improvement">Queuing times during high-traffic hours need improvement</h4>
<p>As we've seen here, Netlify's build speeds are actually within spitting distance of Vercel's, but significant time spent queuing makes it uncompetitive from the perspective of a real-world user.</p>
<p>From my limited testing so far, queue times seem to be much better during hours with low traffic (last tested ~4AM PST on a Sunday), so I'd wager this is a capacity problem that Netlify may be able to throw more money at to improve. But I don't have the full picture here, so maybe it's not that simple.</p>
<h4 id="heading-make-queuing-times-more-transparent-and-readily-accessible">Make queuing times more transparent and readily accessible</h4>
<p>Ideally, every deployment service should do this in the interest of transparency, but during my testing, Netlify is the only one I've found to suffer from queuing times to a significant enough degree to sorely need it. For everything else, queuing time has been a rounding error compared to total deploy duration.</p>
<p>One could argue that if Netlify improved queuing times enough to match other services tested, then this could become unnecessary. Will leave it up to them to decide which to tackle first.</p>
<blockquote>
<p>There's even a profit argument in favor of providing better insight into queue times for providers like Netlify that sell higher concurrency limits (at least if your queue times for builds you have spare concurrency for are competitive). I'm sure this would come in handy for capacity planning, i.e. deciding how much concurrency to buy to make the best tradeoff between costs and team productivity.</p>
<p>That said, this is only a concern for everything tested here <em>besides Reflame</em>, because we don't artificially limit concurrency based on how much you pay us. We can afford to do this precisely because we've built the tech to only have to dedicate milliseconds of compute to each deploy as opposed to 10s of seconds, so we can run these deploys on a small handful of simple multitenant web servers instead of an elaborate compute cluster with tons of scheduling, queueing, and startup overhead, and squeeze 100x more deploys on the same hardware compared to every other service.</p>
<p>This 100x better tech, combined with simple, flat monthly price per user pricing with no BS is what ensures Reflame's incentives are aligned with the interests of our customers. We are always incentivized to make deploys faster for you, so we can pocket more revenue as profit.</p>
</blockquote>
<p>That concludes all of the updates I had planned. What follows is the original article, please read it with a generous helping of salt considering the results for Netlify were heavily skewed by the double build problem.</p>
<hr />
<blockquote>
<p>I also made an abridged version of this post as a <a target="_blank" href="https://twitter.com/lewisl9029/status/1591530406909997056">Twitter thread</a>. Check it out if you just want an executive summary.</p>
<p>HN discussions here: https://news.ycombinator.com/item?id=33576753</p>
</blockquote>
<h2 id="heading-the-motivation">The motivation</h2>
<p>It's been about a month since <a target="_blank" href="https://blog.reflame.app/reflames-launch-was-a-great-success">Reflame's launch</a> on <a target="_blank" href="https://news.ycombinator.com/item?id=33134059">Show HN</a>. Before the launch, all we had on Reflame were small hobby projects (and https://reflame.app itself, which was still on the small side, on the order of hundreds of modules). These days, I'm starting to see larger and larger projects pop up, which is exciting! But scary at the same time. </p>
<p>While Reflame's big-picture architecture is designed to be able to deploy updates with millisecond latencies for projects of arbitrary size, it's missing a bunch of important micro-optimizations that can compound catastrophically for a large enough project (think too much heavy compute running on a single node, running out of memory/temp disk space, bumping into third-party rate limits, etc). This is one of the many things that has very literally been keeping me awake at night.</p>
<p>So, I've been working on this since the <a target="_blank" href="https://blog.reflame.app/new-in-reflame-create-apps-in-1-click">previous update</a>.</p>
<h2 id="heading-the-setup">The setup</h2>
<p>To start with, I've set up a new <a target="_blank" href="https://github.com/reflame/deployment-benchmark">deployment-benchmark repo</a> using our shiny new 1-click app creation feature:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/yI8TLU6Zy18">https://youtu.be/yI8TLU6Zy18</a></div>
<blockquote>
<p>A brand new repo with React, Vite, and Reflame fully set up and deployed in under 30 seconds! AFAIK this is the fastest way to create and deploy a production-ready Vite React repo on the internet. <a target="_blank" href="https://reflame.app/dashboard/create-app?foundationName=vite-react">Give it a try yourself</a> the next time you're starting a new project! </p>
<p>The repo works out of the box with both Vercel and Netlify, so all you need to do is to add the repo through their dashboards to start deploying with those simultaneously, if you want to do your own comparisons before deciding. Cloudflare Pages requires slightly more setup, more on this <a class="post-section-overview" href="#heading-cloudflare-pages">later</a>.</p>
</blockquote>
<p>There's a lot more benchmarking and optimizations I still have to do over the next few weeks and months, but before moving on to more elaborate setups, I thought it would be useful to first take some time to establish a baseline deployment speed for this repo while it's still a tiny, near-stock Vite React app with &lt;10 source files total.</p>
<p>To keep myself honest about the scenarios where Reflame shines and where it falls short, I've also connected a few other deployment services for client-rendered React apps that folks might already be using. To start with, I've added Vercel, Netlify, and Cloudflare Pages. </p>
<p>Let me know in the comments if there are others you'd like to see! Anything that can deploy a stock Vite React app should be fair game.</p>
<h2 id="heading-the-results">The results</h2>
<p>Without further ado, here's an unedited screen recording of me creating a new PR and waiting for all the preview deploys to finish:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://www.youtube.com/watch?v=TXQx64SlQI8">https://www.youtube.com/watch?v=TXQx64SlQI8</a></div>
<p>If you're patient enough to want to watch the whole thing, I highly recommend watching at 2x speed. It'll take a while. </p>
<p>Alternatively, keep scrolling for a pretty chart, and some commentary on the results.</p>
<blockquote>
<p>I realize this is not very scientific since we're only looking at a single run, but I've repeated this dozens of times over the past few weeks and have not seen any variations significant enough to materially affect the commentary here. Over the next few weeks/months I plan on evolving the repo to conduct more sophisticated and statistically rigorous benchmarks on an ongoing basis to use as health checks and detect performance regressions. Keep an eye out for updates on this blog!</p>
<p>I'd also like to invite anyone curious enough to try reproducing the results here themselves by creating a new Vite React repo and connecting all of these deployment services to it. Shameless plug: easiest way to create a repo for this purpose is through Reflame's <a target="_blank" href="https://reflame.app/dashboard/create-app?foundationName=vite-react">1-click app creation</a> feature. The entire process shouldn't take longer than 30 minutes if you follow this route.</p>
</blockquote>
<h2 id="heading-the-details">The details</h2>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://codesandbox.io/embed/preview-deployment-timeline-bksx2p?fontsize=14&amp;hidenavigation=1&amp;theme=dark&amp;view=preview">https://codesandbox.io/embed/preview-deployment-timeline-bksx2p?fontsize=14&amp;hidenavigation=1&amp;theme=dark&amp;view=preview</a></div>
<p>I made the chart above using rough timestamps in the video to help visualize how long each deploy ends up taking, and the various components involved.</p>
<h3 id="heading-reflame">🥇 Reflame</h3>
<p>The results for Reflame here are not all that interesting, because it finished well before I could even finish creating the PR to see status checks. The self-reported latency is 1.071s, but it's impossible to measure at any finer granularity from the video alone.</p>
<p>That said, what we're looking to measure with this exercise is the latency experienced by a real world user, so this result is fast enough to be considered <em>instant</em> for that purpose (I have a lot more to say about this <a class="post-section-overview" href="#heading-parting-words">at the end</a>). This shouldn't come as a surprise to anyone who's been following along. This is Reflame's bread and butter, after all.</p>
<p>Results for the other services, however, are actually a lot more interesting than I expected them to be. Let's go through each, ordered by their ranking.</p>
<h3 id="heading-vercel">🥈 Vercel</h3>
<p>Vercel came in second place, at an end-to-end deployment latency of about 16 seconds from when the branch was created. </p>
<p>This is actually a very respectable showing considering Vercel is just cloning the repo and running Vite in a container/VM. This is exactly what Netlify and Cloudflare Pages do as well, so in theory there isn't actually <em>that</em> much room for differentiation between these three services. The fact that there was such a dramatic difference between them in practice came as a huge surprise to me, and I'll be diving into the major contributors in the <a class="post-section-overview" href="#heading-cloudflare-pages">later</a> <a class="post-section-overview" href="#heading-netlify">sections</a>.</p>
<blockquote>
<p>Even though we're deploying a Vite app here, Reflame does not actually run Vite anywhere in its deployment pipeline, unlike everything else being compared here. Actually running Vite requires spinning up a container/VM, running <code>git clone</code>, <code>npm install</code>, and <code>vite build</code>, and then finally deploying the result. Each of these operations has a latency floor on the order of whole seconds, which would make the kinds of latencies Reflame targets completely impossible.</p>
<p>Instead, Reflame fetches only the modules that have actually changed, performs a minimal transform on them to get rid of JSX and TypeScript syntax, and deploys them as independent ES modules. We then sprinkle some custom dependency analysis and aggressive prefetching on top to flatten the module loading waterfall, in order to keep initial load performance reasonable while optimizing for minimal write-amplification (key to tapering the normally linear scaling of deployment speed with codebase size) and maximum cache granularity.</p>
<p>Cache granularity in particular is arguably more important than initial load performance for the kinds of apps Reflame is intended for (think product dashboards with no publicly indexable content that people repeatedly log into), yet it has been largely ignored by the industry in favor of chasing higher Lighthouse scores, which are optimized for measuring performance on initial visits. I'll have more to say on this topic in a later post.</p>
</blockquote>
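<p>To make the "deploy only what changed" idea above concrete, here's a minimal hypothetical sketch (my illustration, not Reflame's actual implementation): detect changed modules by content hash against the previous deploy's manifest. The same per-module hashing is what enables fine-grained caching, since an unchanged module keeps its hash and therefore its cached artifact.</p>

```javascript
import { createHash } from "node:crypto";

// Hash a module's source; unchanged sources keep the same hash,
// so their previously deployed (and cached) artifacts can be reused.
const hashModule = (source) =>
  createHash("sha256").update(source).digest("hex");

// Compare each module against the manifest from the previous deploy
// and return only the paths that actually need re-deploying.
const changedModules = (files, previousManifest) =>
  Object.entries(files)
    .filter(([path, source]) => hashModule(source) !== previousManifest[path])
    .map(([path]) => path);
```

<p>With per-module hashes, a one-line edit to a single component invalidates exactly one artifact rather than an entire bundle, which is where the sublinear write-amplification comes from.</p>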
<p>Before I move on though, I wanted to note that, if you take a look at the screen recording, you can see I actually didn't realize the deploy was finished until much later, because of a bug with the GitHub UI failing to update the status check in time. We can see the comment at the top of the screen update <a target="_blank" href="https://youtu.be/TXQx64SlQI8?t=30">at around the 30 second mark</a>, but I was tunnel-visioning hard on the status checks and didn't realize until about 10 seconds later.</p>
<p>Chances are, we've all probably seen this happen on GitHub before. Vercel is obviously not at fault here, so I used the timestamp of the comment appearing to calculate deployment time for the purpose of this exercise. However, when this happens in the real world, it invariably results in a crappy user experience, regardless of who's at fault. </p>
<p>This can happen when creating a new PR with any deployment service on GitHub except Reflame, because Reflame deploys are practically guaranteed to finish before any human can finish creating a PR from their branch (in fact, I wouldn't be surprised if the GitHub API calls involved here alone took longer than a Reflame deploy, even before taking the human bottleneck into account). Reflame status checks on PR creation will already be complete on the initial page load, so they're never affected by flaky WebSocket updates from GitHub.</p>
<blockquote>
<p>Fun fact: I actually had to stop sending the in-progress status check entirely for Reflame. The in-progress and completed status checks were always sent in such quick succession that they sometimes raced, causing the status check to never be marked as complete. Luckily, Reflame deploys so fast that users would practically never see an in-progress status check even if we sent it anyway, so it was fairly safe to omit.</p>
</blockquote>
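<p>The workaround in the fun fact above can be sketched against GitHub's Checks API: instead of posting an <code>in_progress</code> check run and then racing a completion update behind it, create a single check run that is already completed. This is a hedged illustration, not Reflame's actual code; the check name, <code>buildCheckPayload</code>/<code>createCompletedCheck</code> helpers, and the <code>GITHUB_TOKEN</code> env var are my assumptions.</p>

```javascript
// Build the payload for a check run that is reported as completed in one
// shot, with no separate "in_progress" update that could race with it.
const buildCheckPayload = (headSha) => ({
  name: "deploy",            // illustrative check name
  head_sha: headSha,
  status: "completed",       // never sent as "in_progress" first
  conclusion: "success",
});

// POST it to the Checks API (requires a GitHub App installation token).
const createCompletedCheck = (owner, repo, headSha) =>
  fetch(`https://api.github.com/repos/${owner}/${repo}/check-runs`, {
    method: "POST",
    headers: {
      authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
      accept: "application/vnd.github+json",
      "content-type": "application/json",
    },
    body: JSON.stringify(buildCheckPayload(headSha)),
  });
```

<p>Collapsing the two updates into one request sidesteps the out-of-order-delivery race entirely, at the cost of never showing a pending state — which only works because the deploy finishes before anyone would see it.</p>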
<h3 id="heading-cloudflare-pages">🥉 Cloudflare Pages</h3>
<p>Third place went to Cloudflare Pages, finishing in about 37 seconds, or about 21 seconds after Vercel.</p>
<p>A quick look at the build logs from Vercel and CF Pages presents some clues as to where the differences might lie:</p>
<p>Here's CF Pages: </p>
<pre><code><span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">17.630287</span>Z    Cloning repository...
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">18.520063</span>Z    From https:<span class="hljs-comment">//github.com/reflame/deployment-benchmark</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">18.520692</span>Z     * branch            <span class="hljs-number">733417</span>b754cefeff2ff56d88d35ae9fe9f36fb8d -&gt; FETCH_HEAD
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">18.520901</span>Z    
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">18.56086</span>Z    HEAD is now at <span class="hljs-number">733417</span>b Update App.jsx
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">18.5614</span>Z    
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">18.698318</span>Z    
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">18.724411</span>Z    Success: Finished cloning repository files
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">21.814043</span>Z    Installing dependencies
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">21.824443</span>Z    Python version set to <span class="hljs-number">2.7</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">24.931283</span>Z    Downloading and installing node v16<span class="hljs-number">.18</span><span class="hljs-number">.0</span>...
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">25.339846</span>Z    Downloading https:<span class="hljs-comment">//nodejs.org/dist/v16.18.0/node-v16.18.0-linux-x64.tar.xz...</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">25.796972</span>Z    Computing checksum <span class="hljs-keyword">with</span> sha256sum
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">25.929794</span>Z    Checksums matched!
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">30.286701</span>Z    Now using node v16<span class="hljs-number">.18</span><span class="hljs-number">.0</span> (npm v8<span class="hljs-number">.19</span><span class="hljs-number">.2</span>)
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">30.694917</span>Z    Started restoring cached build plugins
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">30.707332</span>Z    Finished restoring cached build plugins
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">31.215194</span>Z    Attempting ruby version <span class="hljs-number">2.7</span><span class="hljs-number">.1</span>, read <span class="hljs-keyword">from</span> environment
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">34.746392</span>Z    Using ruby version <span class="hljs-number">2.7</span><span class="hljs-number">.1</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">35.099867</span>Z    Using PHP version <span class="hljs-number">5.6</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">35.251971</span>Z    <span class="hljs-number">5.2</span> is already installed.
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">35.281233</span>Z    Using Swift version <span class="hljs-number">5.2</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">35.281962</span>Z    Started restoring cached node modules
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">35.297622</span>Z    Finished restoring cached node modules
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">35.807495</span>Z    Installing NPM modules using NPM version <span class="hljs-number">8.19</span><span class="hljs-number">.2</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">36.217549</span>Z    npm WARN config tmp This setting is no longer used.  npm stores temporary files <span class="hljs-keyword">in</span> a special
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">36.217889</span>Z    npm WARN config location <span class="hljs-keyword">in</span> the cache, and they are managed by
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">36.218149</span>Z    npm WARN config     [<span class="hljs-string">`cacache`</span>](http:<span class="hljs-comment">//npm.im/cacache).</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">36.618472</span>Z    npm WARN config tmp This setting is no longer used.  npm stores temporary files <span class="hljs-keyword">in</span> a special
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">36.619103</span>Z    npm WARN config location <span class="hljs-keyword">in</span> the cache, and they are managed by
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">36.619309</span>Z    npm WARN config     [<span class="hljs-string">`cacache`</span>](http:<span class="hljs-comment">//npm.im/cacache).</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">38.795358</span>Z    
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">38.795652</span>Z    added <span class="hljs-number">85</span> packages, and audited <span class="hljs-number">86</span> packages <span class="hljs-keyword">in</span> <span class="hljs-number">2</span>s
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">38.795817</span>Z    
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">38.795937</span>Z    <span class="hljs-number">8</span> packages are looking <span class="hljs-keyword">for</span> funding
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">38.796062</span>Z      run <span class="hljs-string">`npm fund`</span> <span class="hljs-keyword">for</span> details
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">38.797004</span>Z    
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">38.797278</span>Z    found <span class="hljs-number">0</span> vulnerabilities
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">38.807557</span>Z    NPM modules installed
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">39.386013</span>Z    npm WARN config tmp This setting is no longer used.  npm stores temporary files <span class="hljs-keyword">in</span> a special
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">39.386411</span>Z    npm WARN config location <span class="hljs-keyword">in</span> the cache, and they are managed by
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">39.386585</span>Z    npm WARN config     [<span class="hljs-string">`cacache`</span>](http:<span class="hljs-comment">//npm.im/cacache).</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">39.405024</span>Z    Installing Hugo <span class="hljs-number">0.54</span><span class="hljs-number">.0</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.188585</span>Z    Hugo Static Site Generator v0<span class="hljs-number">.54</span><span class="hljs-number">.0</span>-B1A82C61A/extended linux/amd64 BuildDate: <span class="hljs-number">2019</span><span class="hljs-number">-02</span><span class="hljs-number">-01</span>T10:<span class="hljs-number">04</span>:<span class="hljs-number">38</span>Z
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.192651</span>Z    Started restoring cached go cache
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.211269</span>Z    Finished restoring cached go cache
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.358634</span>Z    go version go1<span class="hljs-number">.14</span><span class="hljs-number">.4</span> linux/amd64
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.37351</span>Z    go version go1<span class="hljs-number">.14</span><span class="hljs-number">.4</span> linux/amd64
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.376529</span>Z    Installing missing commands
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.376768</span>Z    Verify run directory
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.376908</span>Z    Executing user command: npm run build
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.853109</span>Z    npm WARN config tmp This setting is no longer used.  npm stores temporary files <span class="hljs-keyword">in</span> a special
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.853487</span>Z    npm WARN config location <span class="hljs-keyword">in</span> the cache, and they are managed by
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.853695</span>Z    npm WARN config     [<span class="hljs-string">`cacache`</span>](http:<span class="hljs-comment">//npm.im/cacache).</span>
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.869037</span>Z    
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.86927</span>Z    &gt; example-vite-react@<span class="hljs-number">0.0</span><span class="hljs-number">.0</span> build
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.869412</span>Z    &gt; vite build
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">40.869531</span>Z    
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">41.510958</span>Z    vite v3<span class="hljs-number">.1</span><span class="hljs-number">.7</span> building <span class="hljs-keyword">for</span> production...
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">41.547569</span>Z    transforming...
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.531186</span>Z    ✓ <span class="hljs-number">38</span> modules transformed.
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.645557</span>Z    rendering chunks...
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.651676</span>Z    dist/assets/react<span class="hljs-number">.35</span>ef61ed.svg   <span class="hljs-number">4.03</span> KiB
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.651924</span>Z    dist/assets/vite<span class="hljs-number">.4</span>a748afd.svg    <span class="hljs-number">1.46</span> KiB
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.652091</span>Z    dist/assets/icon<span class="hljs-number">.2129</span>a660.svg    <span class="hljs-number">4.62</span> KiB
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.654023</span>Z    dist/index.html                  <span class="hljs-number">0.44</span> KiB
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.655668</span>Z    dist/assets/index<span class="hljs-number">.1</span>d2a7c20.css   <span class="hljs-number">1.80</span> KiB / gzip: <span class="hljs-number">0.92</span> KiB
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.660648</span>Z    dist/assets/index.c65a176d.js    <span class="hljs-number">140.69</span> KiB / gzip: <span class="hljs-number">45.44</span> KiB
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.694059</span>Z    Finished
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.694677</span>Z    Note: No functions dir at /functions found. Skipping.
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">42.695099</span>Z    Validating asset output directory
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">43.554269</span>Z    Deploying your site to Cloudflare's global network...
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">48.117372</span>Z    Success: Assets published!
<span class="hljs-number">2022</span><span class="hljs-number">-10</span><span class="hljs-number">-25</span>T09:<span class="hljs-number">50</span>:<span class="hljs-number">48.584581</span>Z    Success: Your site was deployed!
</code></pre><p>Here's Vercel:</p>
<pre><code>[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">18.866</span>] Cloning github.com/reflame/deployment-benchmark (Branch: edit, <span class="hljs-attr">Commit</span>: <span class="hljs-number">733417</span>b)
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">19.250</span>] Cloning completed: <span class="hljs-number">383.076</span>ms
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">19.652</span>] Looking up build cache...
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">20.970</span>] Build cache downloaded [<span class="hljs-number">11.34</span> MB]: <span class="hljs-number">1057</span>ms
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">21.009</span>] Running <span class="hljs-string">"vercel build"</span>
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">21.507</span>] Vercel CLI <span class="hljs-number">28.4</span><span class="hljs-number">.12</span><span class="hljs-number">-05</span>a80a4
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">22.086</span>] Installing dependencies...
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">22.728</span>] 
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">22.729</span>] up to date <span class="hljs-keyword">in</span> <span class="hljs-number">315</span>ms
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">22.729</span>] 
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">22.729</span>] <span class="hljs-number">8</span> packages are looking <span class="hljs-keyword">for</span> funding
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">22.729</span>]   run <span class="hljs-string">`npm fund`</span> <span class="hljs-keyword">for</span> details
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">22.740</span>] Detected <span class="hljs-string">`package-lock.json`</span> generated by npm <span class="hljs-number">7</span>+...
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">22.741</span>] Running <span class="hljs-string">"npm run build"</span>
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">23.040</span>] 
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">23.040</span>] &gt; example-vite-react@<span class="hljs-number">0.0</span><span class="hljs-number">.0</span> build
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">23.040</span>] &gt; vite build
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">23.041</span>] 
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">23.487</span>] vite v3.1.7 building for production...
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">23.525</span>] transforming...
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.662</span>] ✓ 38 modules transformed.
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.778</span>] rendering chunks...
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.782</span>] dist/assets/react.35ef61ed.svg   4.03 KiB
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.783</span>] dist/assets/vite.4a748afd.svg    1.46 KiB
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.783</span>] dist/assets/icon.2129a660.svg    4.62 KiB
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.784</span>] dist/index.html                  0.44 KiB
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.785</span>] dist/assets/index.1d2a7c20.css   1.80 KiB / gzip: 0.92 KiB
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.790</span>] dist/assets/index.c65a176d.js    140.69 KiB / gzip: 45.44 KiB
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">24.833</span>] Build Completed <span class="hljs-keyword">in</span> /vercel/output [<span class="hljs-number">3</span>s]
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">25.479</span>] Generated build outputs:
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">25.480</span>]  - Static files: <span class="hljs-number">9</span>
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">25.480</span>]  - Serverless Functions: <span class="hljs-number">0</span>
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">25.480</span>]  - Edge Functions: <span class="hljs-number">0</span>
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">25.480</span>] Deployed outputs <span class="hljs-keyword">in</span> <span class="hljs-number">1</span>s
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">26.134</span>] Build completed. Populating build cache...
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">28.818</span>] Uploading build cache [<span class="hljs-number">11.34</span> MB]...
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">29.688</span>] Build cache uploaded: <span class="hljs-number">870.16</span>ms
[<span class="hljs-number">02</span>:<span class="hljs-number">50</span>:<span class="hljs-number">29.718</span>] Done <span class="hljs-keyword">with</span> <span class="hljs-string">"."</span>
</code></pre><p>The difference in sheer length is what first jumped out to me. </p>
<p>Looking closer, Cloudflare Pages seems to be initializing Ruby, PHP, Python, and Swift in a project that doesn't use anything other than Node.js. It also appears to be downloading Node.js, whereas Vercel seems to have it baked into the image.</p>
<p>The setup steps at the beginning, plus various other unrelated steps sprinkled throughout (why Hugo???), seem to be responsible for ~17s of the overall latency. Cutting all of this cruft out would bring latency down to ~20s, which would put CF Pages within spitting distance of Vercel.</p>
<p>CF Pages is still a young product compared to Vercel and Netlify, and based on their track record I have little doubt they will try to optimize this cruft away in due time and become more competitive.</p>
<blockquote>
<p>Quick side note: Cloudflare Pages was the only service that required extra configuration to deploy this near-stock Vite project. There was no preset for Vite, the initial deploy failed with stock settings, and I had to look up their docs to set it up with the right commands and env vars.</p>
<p>If you're setting this up to verify results for yourself, just make sure to set the build command manually to <code>npm run build</code>, the build output directory to <code>/dist</code>, and set the <code>NODE_VERSION</code> env var to <code>16</code> for it to build correctly.</p>
<p>Both Vercel and Netlify were able to deploy the repo as soon as I connected it without any manual configuration.</p>
<p>This is unrelated to our investigation into deployment speed, but I thought I'd flag this anyways in case the folks at Cloudflare are interested in knocking out this quick win.</p>
</blockquote>
<h3 id="heading-netlify">🐢 Netlify</h3>
<p>Last and most disappointingly, we have Netlify. </p>
<p>Netlify performed horrendously here. It didn't even manage to post an in-progress commit status until a good 24s after everything else was already fully deployed, and then took another ~19s to finish after that, totaling ~80s of end-to-end latency.</p>
<p>Yet the Netlify status check proudly claims a deploy time of 15s. If we took their word for it, this would put them on par with Vercel and Cloudflare. But in reality it wildly misrepresents the end-to-end latency we would actually experience, by a factor of more than 5x.</p>
<p>This is an egregious misrepresentation by any standard, and I'd consider it downright deceptive to not include queueing time this significant in the reported result. Worse still, this queueing time is nowhere to be found even on their own build results page, so users have 0 visibility into how long their deploys actually end up taking.</p>
<blockquote>
<p>To offer a bit more nuance, it's impossible for a GitHub app to actually measure and report the exact end-to-end latency that users experience, since the earliest we can start measuring is when we receive a webhook. </p>
<p>(That is, at least if we want measurements that are resistant to clock drift, which could otherwise result in wildly inaccurate, sometimes even negative, latency measurements for something like Reflame, since it completes consistently in well under 2s.)</p>
<p>Vercel reported a duration of 9s, so was off by 7s. </p>
<p>Cloudflare Pages reported a duration of 33s, so was off by 4s.</p>
<p>For Reflame, the reported 1.071s is probably still off by some amount, but it was below the minimum threshold of measurement possible for this exercise, because the deploy was finished by the time we were able to see commit statuses.</p>
<p>Netlify, on the other hand, reported a duration of 15s but had an end-to-end latency of 80s, so was off by 65s. It had plenty of time from receiving the webhook to measure and report the full queueing time, but chose not to in order to make itself look better by reporting only the time spent building.</p>
</blockquote>
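<p>The drift-resistant approach described in the note above boils down to taking both timestamps from the same local clock. A minimal sketch (assumed shape, not any provider's actual code):</p>

```typescript
// Drift-resistant deploy measurement from a webhook handler (hypothetical
// names). Both timestamps come from the same local clock, so drift between
// GitHub's clocks and ours can't skew the number; the tradeoff is that
// GitHub's own webhook delivery delay is excluded, which no provider can
// observe anyway.
function measureDeploy<T>(deploy: () => Promise<T>) {
  return async (): Promise<{ result: T; endToEndMs: number }> => {
    const receivedAt = Date.now(); // earliest observable moment: webhook receipt
    const result = await deploy(); // queueing + build + upload, end to end
    return { result, endToEndMs: Date.now() - receivedAt };
  };
}
```

<p>Reporting <code>endToEndMs</code> (rather than just build time) is what would make a status check honest about queueing.</p>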
<p>My hypothesis is that this long queue time is the result of an overly aggressive cost/capacity management strategy. In a vacuum, this level of aggressive queueing could be considered perfectly reasonable as a cost-saving measure, especially for users on free plans. But in the real world we live in, there are multiple competitors offering free plans with comparable functionality and practically 0 queueing time (and with Reflame specifically, practically 0 end-to-end deploy time as well). </p>
<p>So if I were a Netlify user who cared about deployment speed, I'd be taking <em>a serious look</em> at the many great alternatives out there if Netlify doesn't end up addressing this soon. If this is in fact a cost-saving measure as I hypothesized, Netlify should be able to become competitive with everybody else just by flipping a switch and throwing some more money at the problem. And if they still choose not to, frankly they don't deserve to have you as a user. </p>
<p>Switching costs should be negligible if you're just deploying a simple client-rendered app composed of a bunch of static assets, and in return you'll shave 50s+ off your deploys for the rest of the lifetime of the project (or practically the entire 80s if you choose Reflame). Feel free to DM me at <a target="_blank" href="https://twitter.com/lewisl9029">@lewisl9029</a> on Twitter if you need help with this, even if you're planning to switch to something other than Reflame. I just can't stand seeing Netlify continue to have users while offering such an egregiously subpar experience in an area I'm so passionate about.</p>
<blockquote>
<p>For the record, across all of the dozens of times I ran this test, Netlify always exhibited similar performance characteristics, and never even once posted an in-progress check before the others had completed. </p>
<p>Again, I invite anyone curious enough to try this for themselves on a new Vite React app to independently verify these results. I'm especially interested in seeing results from those with paid plans on Netlify, which may not suffer from queueing times to a similar degree. On the $99/member business plan, they sell something they call the "Priority build environment", after all. Not that I'd recommend paying $99/member to get the baseline experience everybody else is offering for free.</p>
</blockquote>
<p>These results were especially frustrating if we consider that Netlify <em>could</em> have placed second in this race if it didn't have this ridiculously long queueing time. From the time the status check appeared, it actually only took ~19s to complete the deployment, which would have put it neck and neck with Vercel. Instead, because it took 50s+ to even start, it finished dead last by a hilarious margin.</p>
<h2 id="heading-parting-words">Parting words</h2>
<p>Finally, allow me to end on this rant:</p>
<p>The term "instant" has lost all meaning in the deployment tooling space. Every service tested here, except Reflame, currently uses the term "instant" to describe their preview deploys:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1668047940767/9eb0dFby1.png" alt="Screen Shot 2022-10-25 at 3.30.22 AM.png" />
<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1668047943728/zNnIb28Zr.png" alt="Screen Shot 2022-10-25 at 3.34.13 AM.png" />
<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1668047945736/EuqGFkhdo.png" alt="Screen Shot 2022-10-25 at 3.36.56 AM.png" /></p>
<p>This is especially ironic because as we've seen with this exercise, only Reflame is actually fast enough to offer an experience that is truly "instant" in the only useful definition of the word: an experience fast enough to never leave you waiting, even for a brief second.</p>
<p>More and more, I've been intentionally avoiding nebulous terms like "instant" to describe Reflame's deployment speed. Instead I like to present it in terms of cold, hard numbers: 100ms - 500ms from the VSCode extension, 0.5s - 3s from the GitHub app.</p>
<p>Not only is this more honest and easier on my conscience, I find that it connects with users on a more visceral level, and better piques their intellectual curiosity around how Reflame deploys so quickly, which is a great way to get them in the door. Most importantly though, it helps me remind myself that there's always room for improvement.</p>
<p>Vite's update latency for local development is best measured in <a target="_blank" href="https://github.com/yyx990803/vite-vs-next-turbo-hmr#numbers">10s of milliseconds</a>. Reflame deploys to the internet on every update for its shareable, local-dev-like, HMR-enabled Live Previews, so given the speed of light in fiber optics, Reflame might never be able to beat that with its current architecture (though I have some exciting ideas for a future version of Reflame that might, in the distant future!). But that won't stop me from trying to get it as close as the physical limits of the universe will allow.</p>
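<p>To put that physical limit into rough numbers, here's a back-of-the-envelope sketch (the distance is illustrative; light in silica fiber travels at roughly two-thirds of its vacuum speed):</p>

```typescript
// Lower bound on network round-trip time imposed by the speed of light
// in fiber, ignoring routing detours, switching, and server work.
const SPEED_OF_LIGHT_KM_PER_S = 299_792; // in a vacuum
const FIBER_VELOCITY_FACTOR = 0.66; // silica fiber: roughly 2/3 of c

function minRoundTripMs(distanceKm: number): number {
  const oneWaySeconds =
    distanceKm / (SPEED_OF_LIGHT_KM_PER_S * FIBER_VELOCITY_FACTOR);
  return 2 * oneWaySeconds * 1000;
}

// ~4,000 km, roughly the straight-line width of the continental US
// (real fiber paths run longer, and add processing on top):
minRoundTripMs(4000); // ≈ 40ms before any server does any work at all
```

<p>Local dev pays none of this cost, which is why a network deploy can approach, but never match, local HMR latency.</p>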
<p>You don't have to just take my word for that either. </p>
<p>I chose a flat monthly fee per user as the pricing model for Reflame's deployment product precisely to ensure Reflame the business will always be financially incentivized to make your deploys faster, allowing us to serve more customers on the same hardware and pocket the difference as profit. This is in stark contrast to just about every other CI/CD provider out there that rely on usage-based pricing and end up financially disincentivizing themselves from doing anything that could make deploys faster for their users (<a target="_blank" href="https://www.linkedin.com/in/lewisl9029/#:~:text=Senior%20Product%20Engineer">ask me how I know this</a>). I'll probably have more to say on this topic in the future as well...</p>
<p>... But, until then, really appreciate you spending the time to read up to this point! </p>
<p>If this piece piqued your interest in Reflame, and you're building a client-rendered React app, go give it a try for free at https://reflame.app/! Would love to repay you many times over for the time you spent by making sure you never have to wait for a deploy ever again. 🙂</p>
]]></content:encoded></item><item><title><![CDATA[New in Reflame: create apps in 1 click ✨]]></title><description><![CDATA[It's been almost 2 weeks since our launch on Show HN, and I've been hard at work on some very exciting updates in response to your direct feedback, and some of the pain points I've observed.
Our headline feature for this update: 1-click app creation ...]]></description><link>https://blog.reflame.app/new-in-reflame-create-apps-in-1-click</link><guid isPermaLink="true">https://blog.reflame.app/new-in-reflame-create-apps-in-1-click</guid><category><![CDATA[React]]></category><category><![CDATA[deployment]]></category><category><![CDATA[vite]]></category><category><![CDATA[create-react-app]]></category><dc:creator><![CDATA[Lewis Liu]]></dc:creator><pubDate>Fri, 21 Oct 2022 06:28:30 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1666333622397/Qk5gaXuRY.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>It's been almost 2 weeks since <a target="_blank" href="https://blog.reflame.app/reflames-launch-was-a-great-success">our launch</a> on <a target="_blank" href="https://news.ycombinator.com/item?id=33134059">Show HN</a>, and I've been hard at work on some very exciting updates in response to your direct feedback, and some of the pain points I've observed.</p>
<h1 id="heading-our-headline-feature-for-this-update-1-click-app-creation">Our headline feature for this update: 1-click app creation ✨</h1>
<p>Here's how quickly we can create a React app built with Reflame + Vite (oops, spoiler alert!), along with a brand new, pre-connected GitHub repo that Reflame will watch to deploy previews and production instantly on every change:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/Fiz7qI0ziGo">https://youtu.be/Fiz7qI0ziGo</a></div>
<p>Choosing the non-default vite-react Foundation (uh oh, more spoilers!) required a few extra clicks, but you get the idea.</p>
<p>If you were one of the folks who ran into issues trying to create apps with the old flow, or gave up half way because of how clunky the experience was, please <a target="_blank" href="https://reflame.app/dashboard/create-app">give this new flow a shot</a>, and let me know what you think!</p>
<h1 id="heading-vite-compatibility">Vite compatibility!</h1>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1666330407723/-GbDEwrbk.png" alt="Screen Shot 2022-10-20 at 7.25.25 PM.png" /></p>
<p>Here's an <a target="_blank" href="https://github.com/reflame/example-vite-react">example repo</a> showcasing what Vite compatibility with Reflame looks like.</p>
<p>Essentially, Vite and Reflame can now live in the same codebase. So we can keep using Vite for local dev, and any Vite-compatible deployment solution for deploying previews and production while trying out Reflame. </p>
<p>Vite compatibility should also alleviate some of your very valid concerns around lock-in with Reflame, now that we have a well-supported migration path towards the state-of-the-art alternative (previously we only had Create React App compat) in case your experiment with Reflame doesn't work out.</p>
<p>If lock-in was the only thing making you hesitant to try out Reflame before, wait no longer! Follow <a target="_blank" href="https://reflame.app/dashboard/create-app?foundationName=vite-react">this link</a> to create a new app with Reflame and Vite in 1 click. If you still have other concerns, I'd love to hear about them!</p>
<h1 id="heading-foundations-experimental">Foundations (experimental)</h1>
<p>Time for a rant: </p>
<p>I've never been a huge fan of presets. They make it very easy to get a new app started (which is great! 🎉), but they do little to help us improve our apps over the long term as improvements are made to the preset itself (not as great 😞).</p>
<p>Despite all its shortcomings in other areas compared to its more modern peers, Create React App still excels in this aspect today, meticulously documenting every change and maintaining detailed migration guides between every version in its <a target="_blank" href="https://github.com/facebook/create-react-app/blob/main/CHANGELOG.md">changelog</a>. </p>
<p>Vite takes a step backwards here by reverting to a more hands-off, preset-based approach that places the burden of keeping up with preset updates back on the shoulders of users. Luckily, all the existing presets are very simple and most have not experienced major changes yet, but this approach has never stood the test of time.</p>
<p>With foundations in Reflame, I intend to take the commitment to supporting users over the long term further than even CRA ever could, by taking advantage of the fact that Reflame is a SaaS with tight integration into your source control. </p>
<p>Foundations powers all of the new app creation features you just read about above. We have 4 foundations today, and you can see example repos for each in our <a target="_blank" href="https://github.com/reflame">GitHub page</a>. </p>
<p>Foundations are versioned, so when a foundation is changed in a material way (i.e. in ways we believe lots of users could benefit from), we could not only bump the version and provide a detailed changelog and migration guide, but also try to make the migration process as seamless as possible for our users through automated PRs to apply those updates with the click of a button!</p>
<p>To set expectations, this is very much still an early experiment, and work on this will have to be balanced against the many other things on our roadmap. So if this is something you'd like to see us explore further in Reflame (or alternatively if you'd rather see us focusing on other things), I'd love to hear from you! Any feedback we receive will help inform how we prioritize this exciting new exploration.</p>
<h1 id="heading-handling-unhappy-paths">Handling unhappy paths...</h1>
<p>Reflame was launched as a Minimum Viable Product, with a barely working happy path, and very little consideration for the many unhappy paths that users could encounter along the way. </p>
<p>Now that Reflame can be used by anyone who signs up for an account, one really bad experience could be the end of that relationship, so making the worst case experience slightly better for everybody has been a huge priority for me these past few weeks:</p>
<h2 id="heading-more-better-error-boundaries">More, better error boundaries</h2>
<p>Some of you might remember running into a blank, unstyled page with the message "Oops, something has gone horribly wrong" whenever an error occurred. The fact that it's unstyled and looks really jarring is obviously not great, but what's worse is that it'd also wipe out the support chat widget so you can't even use it to report the issue! </p>
<p>This happened because we only had one error boundary wrapping the entire app at the top-most root, and we use a <a target="_blank" href="https://github.com/calibreapp/react-live-chat-loader">React library</a> to load our support chat provider lazily on interaction further down in the tree.</p>
<p>Here's how the new error boundary works:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/Xck1stwb7u0">https://youtu.be/Xck1stwb7u0</a></div>
<p>First thing you'll notice is that it's no longer a blank page with unstyled text. We're off to a great start! 😌</p>
<p>Also note that both the chat widget and the navbar are preserved. This is because we now have multiple layers of error boundaries handling errors at different locations in the component tree. A page-level error like this one will be caught by the boundary immediately outside the page, preserving everything higher up in the tree. Any navigation will also reset the error boundary, so users aren't stuck there after navigating to a new page that will likely not fail in a similar way.</p>
<h2 id="heading-status-checks-on-failed-deploys">Status checks on failed deploys</h2>
<p>Previously, when our GitHub App failed to deploy a commit for some unforeseen reason, nothing would happen from the user's perspective. The commit would just sit there without a status check forever. It's obviously not great to leave the user hanging in limbo like this. Our whole selling point is making sure you never have to wait for a deploy, yet here we are making you wait potentially forever 🤦‍♂️.</p>
<p>So we fixed that, and added a shiny new deploy error reporting flow:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/oM4T-KIRL4Y">https://youtu.be/oM4T-KIRL4Y</a></div>
<p>This generic error status check now shows up immediately on any deploy error that hasn't yet been properly handled with specialized logic, along with a link to open a support ticket with a trace ID attached to make it easier for us to look directly into the offending deploy.</p>
<p>And last but certainly not least...</p>
<h2 id="heading-traces-traces-everywhere">Traces, traces everywhere 🧐</h2>
<p>Prior to the launch, most errors that I needed to look into occurred in the service responsible for actually processing incoming updates and deploying them. This made sense because most of my users were already fully onboarded and working on their apps. </p>
<p>So, this is where I focused most of my observability efforts, with an elaborate Open Telemetry tracing setup, including spans for every non-trivial operation to keep a close eye on where most of the latency budget is being spent on each deployment request.</p>
<p>Here's the shape of a typical trace for that service:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1666331669588/AJcFxWZqn.png" alt="Screen Shot 2022-10-20 at 8.03.30 PM.png" /></p>
<p>This was invaluable for the deployment service, but I didn't really have a burning need to add a similar level of observability for anything else... until the launch. Suddenly, the service with the most problems was the rather simple one serving the APIs used by the dashboard. </p>
<p>Luckily, this was painful enough to force me to do the right thing within a day of the launch: add full Open Telemetry tracing to the API service, and take the time to improve the existing tracing setup by exposing trace IDs for every request in response headers, so we always know where to look for any problematic request.</p>
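<p>The trace-ID-in-headers idea can be sketched like this (hypothetical names and shapes, not Reflame's actual middleware; with OpenTelemetry, <code>getTraceId</code> could be <code>() =&gt; trace.getActiveSpan()?.spanContext().traceId</code>):</p>

```typescript
// Surface a trace ID on every API response, so a user's bug report can
// name the exact backend trace to look at.
type Req = { url: string };
type Res = { headers: Record<string, string> };
type Handler = (req: Req, res: Res) => void;

function withTraceId(
  getTraceId: () => string | undefined, // supplied by your tracing SDK
  handler: Handler
): Handler {
  return (req, res) => {
    const traceId = getTraceId();
    if (traceId) res.headers["x-trace-id"] = traceId; // expose to the client
    handler(req, res);
  };
}
```
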
<p>This allowed me to squash a whole bunch of nasty bugs users ended up running into later in the launch period. If you tried to onboard and were blocked by some cryptic error, please give us another chance! If I didn't manage to crush the bug you ran into last time around, I'll certainly be better positioned to do it this time. 🙏</p>
<h1 id="heading-other-updates">Other updates</h1>
<h2 id="heading-new-activation-shortcut-cmdctrlshifte-for-the-vscode-extension">New activation shortcut (cmd/ctrl+shift+e) for the VSCode extension</h2>
<p>Previously, all of the keyboard shortcuts in our VSCode extension were two-part combos prefixed with cmd/ctrl+shift+a. This worked pretty well inside VSCode, since it didn't shadow any important built-in functionality there, but I failed to account for the fact that cmd/ctrl+shift+a brings up tab search in Chromium-based browsers. </p>
<p>Eventually I want to get the Reflame VSCode extension working in web-based editors like vscode.dev and StackBlitz (it already works for remotely hosted editors like GitHub Codespaces, give it a try!), and shadowing such a useful browser shortcut feels like it's going to cause a lot of frustration over the long term. So, I went on the search for a new shortcut that's easy to activate with 1 hand, and eventually landed on cmd/ctrl+shift+e. </p>
<p>I realize this will cause some pain in the short term for users who have built up muscle memory for the previous shortcut (trust me, I feel your pain intensely), but I felt it's better to rip off the bandaid early than to ignore the problem and watch it grow into something we can no longer afford to fix.</p>
<h2 id="heading-support-for-jsx-files">Support for .jsx files</h2>
<p>Reflame follows Create React App in accepting JSX in .js files, because having to rename files just to add JSX to them introduces too much friction to be worth the potential perf gains, in my opinion (and using .jsx for everything to begin with would defeat the whole point of having separate extensions in the first place). TypeScript enforces that JSX can only exist in .tsx files, so we already supported .tsx files, but there was no similar forcing function for .jsx support, so we never added it.</p>
<p>Vite ended up being this forcing function, since it only supports JSX in .jsx files, so we implemented .jsx support. Luckily, this was the only new feature we needed to add to the deployment pipeline for Vite support, since most of the other pieces were already in place from our previous work on Create React App support.</p>
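<p>As an aside, for anyone who wants the CRA-style behavior (JSX in plain .js files) in a standalone Vite project, a commonly used workaround is to widen esbuild's loader settings via Vite's esbuild passthrough options. This is a sketch only, unrelated to Reflame's own pipeline; verify against current Vite docs before relying on it:</p>

```typescript
// vite.config.ts (sketch): parse .js source files as JSX
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  // Treat source .js files as JSX during transforms...
  esbuild: { loader: "jsx", include: /src\/.*\.js$/ },
  // ...and during dependency pre-bundling.
  optimizeDeps: { esbuildOptions: { loader: { ".js": "jsx" } } },
});
```
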
<h2 id="heading-differentiate-commit-previews-from-latest-branch-previews">Differentiate commit previews from latest branch previews</h2>
<p>Before:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1666332011398/buT8xORtT.png" alt="Screen Shot 2022-10-20 at 8.44.44 PM.png" /></p>
<p>After:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1666332019362/AFKbrGHyz.png" alt="Screen Shot 2022-10-20 at 8.44.22 PM.png" /></p>
<p>These are the previews you get by clicking on a link from a GitHub commit status. Some users ended up getting confused by this because they expected to see the latest commit on the branch when they refreshed (which was a very reasonable expectation given the unfortunate previous wording!). </p>
<p>Hopefully this change will remove that confusion going forward.</p>
<h2 id="heading-link-to-production-on-deploys-from-default-branch">Link to production on deploys from default branch</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1666332037268/9i3kulve3.png" alt="Screen Shot 2022-10-20 at 8.49.04 PM.png" /></p>
<p>The Production link here will actually clear all of your preview cookies so you're guaranteed to see the same version of the app as customers would. </p>
<p>Note that production deploys have an extra caching layer for serving performance, and there can be some replication delay for regions outside of where the change was deployed (currently all GitHub webhooks seem to be coming in through US East), so you may see a stale version of the app for up to a minute.</p>
<h2 id="heading-added-several-missing-loading-states-for-form-submissions">Added several missing loading states for form submissions</h2>
<p>Some forms were not properly implemented and ended up leaving the loading state as soon as they were clicked, making it look like the submission never happened (even though it was still happening in the background and would complete if given enough time), and worse, making it possible for users to send duplicate submissions, sometimes leaving the user in weird states. This has been fixed.</p>
<h2 id="heading-added-specific-error-handlingmessaging-for-several-common-issues">Added specific error handling/messaging for several common issues</h2>
<p>Examples: App name taken, user name taken, importing empty repo, etc. </p>
<h1 id="heading-final-words">Final words</h1>
<p>And... that concludes our very first product update! 🥳</p>
<p>How did you find it? Too long probably? 😅</p>
<p>Brevity is not my strong suit, and I've already spent the entire afternoon writing and then cutting for hours on end, so I really want to just head back to coding now. Will try harder to write less and/or cut more next time! 🙊</p>
<p>See you again next month! 👋</p>
]]></content:encoded></item><item><title><![CDATA[Reflame's launch was a great success! 🎉]]></title><description><![CDATA[This was ripped from an email I sent to users last week. Sharing here for completeness since I intend to cross-post product updates and other announcements here from now on.

In case you haven't heard yet, Reflame was officially launched on Show HN l...]]></description><link>https://blog.reflame.app/reflames-launch-was-a-great-success</link><guid isPermaLink="true">https://blog.reflame.app/reflames-launch-was-a-great-success</guid><category><![CDATA[React]]></category><category><![CDATA[deployment]]></category><dc:creator><![CDATA[Lewis Liu]]></dc:creator><pubDate>Fri, 14 Oct 2022 02:01:22 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1666316750614/GJvr2hMii.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<blockquote>
<p>This was ripped from an email I sent to users last week. Sharing here for completeness since I intend to cross-post product updates and other announcements here from now on.</p>
</blockquote>
<p>In case you haven't heard yet, <a target="_blank" href="https://reflame.app">Reflame</a> was officially launched on <a target="_blank" href="https://news.ycombinator.com/item?id=33134059">Show HN</a> last week. This means anybody with an account (including you!) can now use Reflame to deploy React web apps instantly, completely free!</p>
<p>The reception to the launch was better than I could have ever hoped for. Our post got to <a target="_blank" href="https://hnrankings.info/33134059/">6th place</a> on the front page, and stayed at #1 on the Show HN page for the entire day. This graph of unique visits to https://reflame.app over the last 7 days tells the story of the launch much better than I ever could in words:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1666316659666/s_a8nynoG.png" alt="cl97umfnq005amg099a7qg3d1.png" /></p>
<p>These 5.2K unique visits turned into over 100 new signups (and counting), which is more than double the size of our existing user base. I've already seen the beginnings of some very cool apps getting built on Reflame by these new users (perhaps you're one of them?), which makes me deeply excited about the future!</p>
<p>Whether you're a long-time user who helped improve Reflame with your feedback along our journey up to this point, or a new user who supported our launch by upvoting our Show HN post and signing up, I'd like to extend to you my deepest gratitude. </p>
<p>Your time is valuable, and I will never take for granted the fact that you shared it with me to try out Reflame. Over the next few months and years, I hope to repay you for that time many times over by <em>making sure you never have to wait for another deploy ever again</em>. </p>
<p>I can't wait to see what we'll build together! :)</p>
]]></content:encoded></item></channel></rss>