Vercel, Creator of Next.js, Raises $150M

Announcing $150M to build the end-to-end platform for the modern Web

Our mission is to make the Web. Faster. We’re excited to announce $150 million in Series D funding at a valuation of over $2.5 billion. We’ll use this funding to accelerate how we:

Support open-source projects

As a company with deep roots in open-source technology, we believe in the power of communities to create a faster Web. We’ve seen this first-hand with Next.js, a collaboration of more than 1,800 developers with now over 2 million weekly installs.

In addition to creating and maintaining Next.js and other open-source projects, we sponsor popular libraries and frameworks like Nuxt, Astro, webpack, Babel, NextAuth, Parcel, and Unified. We work with hundreds of communities across the Web to donate free Vercel accounts to support their projects. Most recently, we welcomed Rich Harris to Vercel to work full-time on Svelte, the powerful open-source framework.

We’ll continue to support projects that contribute to a better Web ecosystem.

Make the Web Edge-first

We believe the modern Web is at the Edge and embraces the open Web platform. Our customers are already embracing this new paradigm of Web development rooted in Edge-aware capabilities. Since our Series A in April 2020, traffic on the Vercel Edge Network has grown over 700%. Large ecommerce organizations are realizing agile and tailored Web experiences are the future of buying online. Startups are building their products with an Edge-first mentality from the beginning because they know that’s where the Web is going and should be.

Last month we released Edge Functions, which unlock the ability to run normally complex personalization without performance implications or client-side burden. But this is just the beginning of what’s possible when your code and infrastructure are deeply integrated.

TypeScript 4.5 Released

Today we’re excited to announce the release of TypeScript 4.5!

If you’re not yet familiar with TypeScript, it’s a language that builds on JavaScript by adding statically checked types. When you use static types, you can run the TypeScript compiler to check for bugs like typos and mismatches in the shapes of your data, and get handy suggestions. These types don’t change your program, and you can remove them to leave you with clean, readable JavaScript. Going beyond catching bugs in your code, TypeScript also assists you in writing code because types can power useful tooling like auto-complete, go-to-definition, and renaming in your editor! You can read more on our website.
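To make that concrete, here is a small, illustrative example of the kind of bug static types catch; the `User` interface and `greet` function are our own, not from the release notes:

```typescript
interface User {
  name: string;
  age: number;
}

function greet(user: User): string {
  // With static types, a typo such as `user.nmae` fails to compile:
  // "Property 'nmae' does not exist on type 'User'."
  return `Hello, ${user.name}!`;
}

console.log(greet({ name: "Ada", age: 36 }));
```

Strip the annotations and you are left with plain, readable JavaScript that runs identically.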

To get started using TypeScript 4.5, you can get it through NuGet, or use npm with the following command:

npm install typescript

Understanding why our build got 15x slower with Webpack 5

The problem

A while back, we encountered an odd problem. All of a sudden, building our front-end went from taking a few seconds to taking a few minutes. We felt this most acutely when starting our front-end development server. Incremental changes were still fast, but if you needed to stop and restart the server, it took forever. We don’t do this very often in a typical day, but it turns out that having to wait for two minutes plus, even if it’s only a few times in a day, really does start to slow a team down.

The deep dive

We started, like we often do, with git bisect. It helped us track down the set of changes that appeared to have introduced the issue: adding a Dark Mode to Tines. “This is bizarre,” we thought, “how could a bunch of CSS changes have slowed things down?”

Our next step was to profile the build locally using Node DevTools. We use Webpack, so attaching a DevTools instance was as simple as running the following command and opening chrome://inspect/#devices.

node --inspect-brk ./node_modules/.bin/webpack

Using the Bottom-Up view of aggregate time spent in functions, we quickly saw that a significant amount of time was being spent in a getDependencies function in Linaria.

As we described before, Linaria is a CSS-in-JS library we use to style our React components. It’s novel in that it does all of its work at build-time, leaning heavily on CSS variables at runtime to achieve this. Our Dark Mode implementation involved creating a colour system and encoding this in CSS variables. Here’s a minimal example.

import { styled } from "@linaria/react";
import React from "react";
import { render } from "react-dom";

const colors = {
  darkblue: "#00008b",
  ivory: "#fffff0"
};

const App = styled.div`
  :global() {
    body {
      --textColor: ${colors.darkblue};
      --backgroundColor: ${colors.ivory};

      @media (prefers-color-scheme: dark) {
        --textColor: ${colors.ivory};
        --backgroundColor: ${colors.darkblue};
      }
    }
  }

  color: var(--textColor);
  background: var(--backgroundColor);
`;

render(<App>Testing 1, 2, 3...</App>, document.querySelector("#root"));

At this point, we felt like we could be looking at a bug in Linaria (or maybe Webpack), so we created a minimal test case. This allowed us to open an issue, and continue debugging with a little more clarity.

Interestingly, the test case allowed us to establish that the slowdown only occurred on Webpack 5, not Webpack 4. This got us thinking that this might, in fact, be an issue in Webpack.

Next, we focused on getDependencies in Linaria.

getDependencies(nodes) {
  return nodes.reduce(
    (acc, node) => acc.concat(Array.from(this.dependencies.get(node) || [])),
    []
  );
}

With liberal use of console.time and console.timeEnd, we established that concat seemed to be where all of the time was going on Webpack 5. We’d validated that the function was being called in an identical manner on Webpack 4 and 5. This didn’t make sense. We were stumped.
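As a sketch of that measurement technique (the loops and labels here are illustrative, not the actual Linaria call sites), console.time and console.timeEnd simply bracket the code you want to measure:

```javascript
// Illustrative timing harness: repeated concat allocates a fresh
// array on every iteration, while push mutates in place.
console.time("concat");
let viaConcat = [];
for (let i = 0; i < 10000; i++) {
  viaConcat = viaConcat.concat([i]);
}
console.timeEnd("concat");

console.time("push");
const viaPush = [];
for (let i = 0; i < 10000; i++) {
  viaPush.push(i);
}
console.timeEnd("push");
```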

Looking at the function, it did seem that there was room for optimisation. “Maybe all that object creation is resulting in more GC on Webpack 5?” We went about refactoring the function to remove concat and build the dependency array in place. Lo and behold, performance in our test case went from 16s to 1s.
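A sketch of that in-place rewrite, shown standalone here with the dependency map passed in as a parameter rather than read off `this`:

```javascript
// Build the dependency array in place instead of allocating a new
// intermediate array on every concat call.
function getDependencies(nodes, dependencies) {
  const acc = [];
  for (const node of nodes) {
    for (const dep of dependencies.get(node) || []) {
      acc.push(dep);
    }
  }
  return acc;
}
```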

We submitted a fix with the following note:

I’m pretty surprised that this has any impact, I don’t have a good rationale for why but it appears to be dramatically faster to update the array in place.

While we were glad to have addressed our issue, we were very unsatisfied with the fix and our understanding (or lack thereof) of the problem. We had a gut feeling that the root cause here might be an interesting one, so we decided to put some more time into better understanding the problem.

The root cause

At this point, we were pretty suspicious of Array.prototype.concat, so we thought we’d try to better understand how it behaved. As a built-in, this required delving into Node and V8. After poking around in the source for a while, one of the first things that stood out was the presence of Fast_ArrayConcat and Slow_ArrayConcat paths. “Could we be going down a slow path on Webpack 5?”, we wondered. “Only one way to find out.”

Building Node from source is pretty straightforward. We got it up and running in about 20 minutes. It’s quite some time since either of us had worked with C++, but we remembered enough to find our way around, add some logging and rebuild. Sure enough, our test case was going down the slow path on Webpack 5, but not on Webpack 4. But why?

Looking at the implementation of Fast_ArrayConcat, it bailed immediately if a call to Protectors::IsIsConcatSpreadableLookupChainIntact returned false. As you might guess from its name, this function returns false if Symbol.isConcatSpreadable has been set. Now, Paddy and I have been writing JavaScript near-daily for quite a number of years, and this was the first either of us had heard of Symbol.isConcatSpreadable. In a nutshell, it allows you to control how arguments to concat are spread. “Could this be the culprit?”
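For the curious, a quick illustration of what the symbol does, using our own toy arrays rather than anything from Webpack:

```javascript
const plain = [1, 2];
const opaque = [3, 4];
// Setting the symbol to false tells concat NOT to spread this array.
opaque[Symbol.isConcatSpreadable] = false;

console.log([0].concat(plain));  // arrays spread by default
console.log([0].concat(opaque)); // spreading suppressed: the array is nested
```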

A quick search of the Webpack source yielded a single use of Symbol.isConcatSpreadable. What’s more, it was added in Webpack 5 in order to maintain backwards compatibility with Webpack 4. While migrating some structures from Array to Set, the Webpack authors required that concat on their new Set structures spread parameters just like Array. A perfectly reasonable use of the symbol.
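Conversely, a non-array can opt in to being spread, which is the mechanism Webpack leaned on. The object below is our own minimal illustration, not Webpack’s actual Set subclass:

```javascript
// An array-like object that asks concat to spread it: concat reads
// `length` and the indexed properties, just as it would for an array.
const setLike = {
  0: "a",
  1: "b",
  length: 2,
  [Symbol.isConcatSpreadable]: true,
};

console.log([].concat(setLike)); // spread into its elements
```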

We quickly removed the usage and our test case performance went from 16s to ~1s.

This was great, but we were still confused. None of the objects in getDependencies had Symbol.isConcatSpreadable set. “So why are we going down this slow path?” We looked at Protectors::IsIsConcatSpreadableLookupChainIntact again and found something that surprised us: “Protectors” in V8 operate only at the Engine level.

💡 What this means is: if Symbol.isConcatSpreadable has been set anywhere in your application, on any object, to any value, any subsequent usage of concat will go down the slow path on Node and Chrome.

const dummy = [];
dummy[Symbol.isConcatSpreadable] = true;
// Slow path on Node and Chrome

So, we had our answer. Our Dark Mode implementation (and another recent change in Linaria) had ballooned the number of nodes in Linaria’s dependency graph. Calls to concat had effectively always been slow, but this change exacerbated the problem to the point where we noticed.

Angular proposal would make NgModules optional

The Angular development team considers a shift toward a simpler reuse model to improve the developer experience.

The developers of Angular, the popular TypeScript-based web framework, are considering making NgModules optional. The goal is to shift Angular toward a simpler reuse model to improve the developer experience.

An RFC (request for comment) proposal being floated on GitHub, titled “Standalone components, directives and pipes — making Angular’s NGModules optional,” seeks to validate the intended design with the community, solicit feedback, and enable experimentation via a non-production-ready prototype. The RFC was posted October 8 and is slated to close no sooner than November 8.

A core Angular concept, NgModules are containers of functionality used to manage dependencies in the framework. When a component needs to make use of another component, directive, pipe, or provider, the dependency is not directly referenced. Instead, an NgModule is imported, containing exported components, directives, and pipes, along with configured providers. As a result, developers must create an NgModule to bootstrap even the simplest “Hello, World” application.
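As a sketch of that ceremony, even “Hello, World” needs a module today. The component and module names below are conventional defaults shown for illustration, not code from the RFC:

```typescript
import { Component, NgModule } from "@angular/core";
import { BrowserModule } from "@angular/platform-browser";
import { platformBrowserDynamic } from "@angular/platform-browser-dynamic";

@Component({
  selector: "app-root",
  template: "<h1>Hello, World</h1>",
})
export class AppComponent {}

// The component alone cannot be bootstrapped; it must first be
// declared in an NgModule that also configures its dependencies.
@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule],
  bootstrap: [AppComponent],
})
export class AppModule {}

platformBrowserDynamic().bootstrapModule(AppModule);
```

Under the RFC, a standalone component could declare its own dependencies and be bootstrapped directly, making the module wrapper optional.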

Having Angular conceptually centered around NgModule negatively impacts the developer experience, the RFC argues. The RFC cites multiple reasons:

  • Authoring of components is more involved than coding a class or template.
  • APIs around loading and rendering are unnecessarily complex and easy to misuse.
  • Reading component code is not sufficient to understand component behavior.
  • Angular tools must deal with “implicit” dependencies of components on their NgModule context.

The RFC calls for moving Angular in a direction where components, directives, and pipes play a more central role, are self-contained, and can be safely imported and used directly. The plan would simplify the “mental model” of Angular, make new APIs possible for using components and directives (such as an API for fine-grained lazy loading), and improve the ability of Angular tools to efficiently process code. Thus Angular would be simpler to use, easier to reason about, less verbose, and faster to compile.

The concept of the NgModule would not be removed from Angular, but would be made optional for typical development tasks. At some point, the Angular development team could consider removing NgModules altogether. Meanwhile, the proposal stresses that NgModules are neither going away nor being deprecated. Developers would continue to be able to create them and use them.

Angular itself continues to move forward while the development team ponders the NgModule change. Angular 13 is due next week, completing the transition to the Ivy compilation and rendering engine. Angular 12, published on June 24, also focused on Ivy.