
Retrospective on a dying technology cycle, part 4: What comes next?

I quit my job a few weeks ago.

Working with the crew at Glitch.com was a highlight of my career. Making technology and digital expression more accessible is one of my core drives. Nobody does it better than they do.

But as the cycle reached its conclusion, like so many startups, Glitch found itself acquired by a larger company. Bureaucracies make me itchy, so I left, eager to ply my trade with startups once again.

Then the Silicon Valley Bank crisis hit.

It was a scary moment to have thrown my lot in with startups. Of course, we know now that the Feds guaranteed the deposits. Once I mopped the sweat from my brow, I got to wondering: what comes next?

The steam of the last cycle is largely exhausted. Mobile is a mature platform, full of incumbents. The impact of cloud computing and open source on engineering productivity is now well understood, priced into our assumptions. The interest rate environment is making venture capital less of a free-for-all.

Meanwhile, innovators have either matured into more conservative companies, or been acquired by them.

So a new cycle begins.

Why disruption is inevitable

Large companies have one job: maintaining the status quo. This isn’t necessarily great for anyone except the companies’ shareholders, and as Research in Motion found when BlackBerry died, even this has limits. Consumers chafe against the slowing improvement and blurring focus of products. Reality keeps on moving while big companies insist their workers make little slideshows and read them to each other, reporting on this and that OKR or KPI.

And suddenly: the product sucks ass.

Google is dealing with this today. Search results are a fetid stew of garbage. They’re choked with ads, many of which are predatory. They lead to pages that aren’t very useful.

Google, once beloved as the best way to access the web, has grown complacent.

Facebook, meanwhile, blew tens of billions—hundreds of times the R&D costs of the first iPhone—to create a VR paradigm shift that never materialized. Their user populations are aging, which poses a challenge for a company that has to sell ads and maintain cultural relevance. They can’t quite catch up to TikTok’s mojo, so they’re hoping that lawmakers will kill their competitor for them.

Twitter has been taken private, stripped down, converted into the plaything of the wealthy.

Microsoft now owns GitHub and LinkedIn.

Netflix, once adored for its innovative approach to distribution and original content, has matured into a shovelware play that cancels most of its shows after a season or two.

Apple now makes so many billions merely from extracting rent on its App Store software distribution infrastructure that the App Store could be a business unto itself.

A swirling, dynamic system full of energy has congealed into a handful of players maintaining their turf.

The thing is, there’s energy locked in this turgid ball of mud. People eager for things that work better, faster, cheaper. Someone will figure out how to reach them.

Then they will take away incumbents’ money, their cultural relevance, and ultimately: their power. Google and Facebook, in particular, are locked in a race to become the next Yahoo: decaying, irrelevant, coasting on the inertia of a dwindling user base, no longer even dreaming of bygone power.

They might both win.

“Artificial” “intelligence”

Look, AI is going to be one of the next big things.

You can feel how you want to feel about it. A lot of how this technology has been approached—non-consensual slurping of artists’ protected works, training on internet hate speech—is weird and gross! The label is inaccurate!

This is not intelligence.

Still: whatever it is, it’s going to create real impact and long-term change.

I’m more comfortable calling “AI” a “pattern synthesis engine” (PSE). You tell it the pattern you’re looking for, and then it disgorges something plausible synthesized from its vast set of training patterns.

The pattern may have a passing resemblance to what you’re looking for.

But even a lossy, incomplete, or inaccurate pattern can have immediate value. It can be a starting point that is cheaper and faster to arrive at than something built manually.

This is of particular interest to me as someone who struggles with motivation around tedious tasks. Having automation to kickstart the process and give me something to chip away at is compelling.

The dawn of the PSE revolution is text and image generation. But patterns rule everything around us. Patterns define software, communications, design, architecture, civil engineering, and more. Automation that accelerates the creation of patterns has broad impact.

Indeed, tools like ChatGPT are already disruptive to incumbent technologies like Google. I have found it faster and more effective to answer software development questions—from which tools and frameworks are viable for a task, to code-level suggestions for solving problems—through ChatGPT instead of hitting a search engine like Google. Microsoft—who you should never, ever sleep on—is seizing the moment, capturing more headlines for search than they have in decades.
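To make that concrete, here’s a minimal sketch of the workflow I mean, wired against OpenAI’s chat completions API. The model name, the prompt, and the OPENAI_API_KEY environment variable are illustrative choices, not a recommendation.

```typescript
// Minimal sketch: ask a pattern synthesis engine a development question
// instead of a search engine. Assumes an OpenAI API key in OPENAI_API_KEY;
// the model name and prompts are illustrative.
async function askPSE(question: string): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        { role: "system", content: "You are a concise assistant for software development questions." },
        { role: "user", content: question },
      ],
    }),
  });
  const data = await response.json();
  // The reply is a synthesized pattern: a plausible starting point, not ground truth.
  return data.choices[0].message.content;
}

askPSE("Which frameworks are viable for a small REST API in Node.js?").then(console.log);
```

The specific provider isn’t the point; what matters is that a synthesized answer arrives in one round trip, already shaped like a starting point you can chip away at.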

Still, I don’t valorize disruption for its own sake. This is going to make a mess. Pattern synthesis makes it cheaper than ever to generate and distribute bullshit, and that’s dangerous in the long term. It won’t stop at text and images. Audio, voices and video are all patterns subject to synthesis. It’s only a matter of time and technological progression before PSEs can manage their complexity cheaply.

On the other hand, many tedious, manual tasks are going to fall before this technology. Individuals will find their leverage multiplied, and small teams will be able to get more done.

The question, as always for labor savings, will be who gets to keep the extra cream.

Remote work

Most CEOs with a return-to-office fetish are acting the dinosaur and they’re going to lose.

Eventually.

Feeling power through asses-in-seats is a ritualistic anachronism from a time when telecommunications was expensive and limited to the bandwidth of an analog telephone line.

Today, bandwidth is orders of magnitude greater, offering rich communication options without precedent. Today, the office is a vulnerability.

In a world more and more subject to climate change, a geographically distributed workforce is a hedge. Look at the wildass weather hitting California in just the last few months. You’ve got cyclones, flooding, extreme winds, power outages. And that’s without getting into the seasonal fires and air pollution of recent years.

One of the great lessons of Covid was that a knowledge workforce could maintain, and even exceed, their typical productivity in a distributed context. Remote work claws back time and energy that would otherwise go to commuting, giving us more of both to spend on the goals and relationships that matter most.

Still, even if remote work is the future, much of it has yet to arrive. The tools for collaboration and communication in a distributed context remain high-friction and even alienating. Being in endless calls is exhausting. There’s limited room for spontaneity.

The success stories of the new cycle will be companies nimble enough to recruit teams from outside their immediate metro area, clever enough to invent new processes to support them, and imaginative enough to productize this learning into scalable, broadly applicable solutions that change the shape of work.

The future does not belong to Slack or Zoom.

The next iteration of the web

One of the most important things Anil Dash ever told me came on a walk after we discussed a role at Glitch.

He explained that he was sympathetic to the passion of the web3 crowd, even if he thought it was misplaced. After all, who could blame them for wanting a version of the web they could shape? Who could begrudge yearning for a future that was malleable and open, instead of defined by a handful of incumbents?

I’ve thought about it endlessly ever since.

While I argue that web3 and its adjacent crypto craze were a money-grab bubble inflated by venture capital, I also agree with Anil: there has to be something better than two or three major websites defining the very substrate of our communication and culture.

In the wake of Twitter’s capture by plutocrats, we’ve seen a possible answer to this hunger. Mastodon, and the larger ecosystem of the Fediverse, has seen a massive injection of energy and participation.

What’s exciting here is a marriage of the magic of Web 1.0—weird and independent—with the social power that came to define Web 2.0. A web built on the Fediverse allows every stakeholder so much more visibility and authorship over the mechanics of communication and community compared to anything that was possible under Facebook, Twitter or TikTok.

This is much the same decentralization and self-determination the web3 crowd prayed for, but it’s built on mundane technology with reasonable operating costs.
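To show how mundane that technology really is, here’s a minimal sketch that reads a public Mastodon timeline over the server’s plain REST API. The instance name is just an example, and the interface covers only the handful of fields used below.

```typescript
// Minimal sketch: public timelines on a Mastodon server are one
// unauthenticated HTTPS request away. The Status interface lists only the
// fields this example touches.
interface Status {
  uri: string;                 // canonical address of the post
  content: string;             // HTML body of the post
  account: { acct: string };   // author handle
}

async function publicTimeline(instance: string): Promise<Status[]> {
  const res = await fetch(`https://${instance}/api/v1/timelines/public?limit=5`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return (await res.json()) as Status[];
}

publicTimeline("mastodon.social").then((statuses) => {
  for (const s of statuses) {
    console.log(`${s.account.acct}: ${s.uri}`);
  }
});
```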

Excess computing capacity

Meanwhile, the cost of computing capacity is changing. While Amazon dominated the last cycle with their cloud offerings, the next one is anybody’s game.

Part of this is the emergence of “edge computing,” “serverless functions,” and the blurring definition of a static vs. dynamic site. These are all artifacts of a simple economic reality: every internet infrastructure company has excess capacity that it can’t guarantee on the same open-ended basis as a full virtual machine, but which it would love to sell you in small, time-boxed slices.

Capital abhors an unleveraged resource.

As this computing paradigm for the web becomes more commonly adopted, the most popular architectures of web technologies will evolve accordingly, opening the door to new names and success stories. As Node.js transformed web development by making it accessible to anyone who could write JavaScript, look for these technologies to make resilient, scalable web infrastructure more broadly accessible, even to non-specialists.
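For a sense of what this paradigm looks like in code, here’s a minimal sketch of an edge function. The handler shape follows Cloudflare Workers’ module syntax, but it’s meant as an illustration of the model, not an endorsement of any one provider.

```typescript
// Minimal sketch of an edge function: a single stateless handler that the
// platform runs in small, time-boxed slices of spare capacity near the user.
// The module syntax here follows Cloudflare Workers; other providers use
// very similar stateless handlers.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);

    // "Static vs. dynamic" blurs here: the response is computed per request,
    // yet there is no server for the author to provision or scale.
    if (url.pathname === "/hello") {
      const body = JSON.stringify({ greeting: "hello", servedAt: new Date().toISOString() });
      return new Response(body, { headers: { "Content-Type": "application/json" } });
    }

    return new Response("Not found", { status: 404 });
  },
};
```

Because the handler is stateless and tiny, the provider can schedule it wherever it has spare cycles, which is exactly the excess capacity being sold.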

Reality, but with shit in front of your eyes

Everything is about screens because the eyes are the highest-bandwidth link we have to the brain without surgical intervention, and even that remains mostly science fiction.

The iPhone was transformational in part because of its comparatively enormous display for its day, and in the years since, phones have competed by developing ever-larger, denser displays.

Part of the logic of virtual reality, which obscures your vision, and augmented reality, which overlays it, is to take this evolution to its maximalist conclusion, completely saturating the eye with as much information as possible.

Whether this has a long-term sustainable consumer application remains to be proven. There are serious headwinds: the energy cost of high-density, high-frequency displays, and the consequent weight of batteries needed to maintain the experience. There’s the overall bulk of the devices, and the dogged friction of miniaturizing so many components to reach fashionable sizes.

But they’re sure going to try. As noted, Facebook has already blown tens of billions, plus a rebrand, trying to make VR happen. Valve did their best to create both the software delivery and hardware platform needed to spur a VR revolution, with limited success. Sony and Samsung have each dabbled.

This year, rumors suggest Apple will enter the fray with their own take on the headset. Precedent allows that they might find traction. With the iPad, Apple entered the stagnant tablet market nearly a decade late with an offering people actually loved.

While tablets didn’t transform computing altogether, Apple made a good business of them and for many, they’ve become indispensable tools.

If indeed VR/AR becomes a viable paradigm for consumer computing, that will kick off a new wave of opportunity. It has implications for navigation, communication, entertainment, specialized labor, and countless other spaces. The social implications will be broad: people were publicly hostile to users of Google Glass—the last pseudo-consumer attempt in this space. Any successful entrant will need to navigate the challenge of making these devices acceptable, and even fashionable.

There are also demonic labor implications for any successful platform in this space: yet more workplace surveillance for any field that broadly adopts the technology. Imagine the boss monitoring your every glance. Yugh.

Lasting discipline, sober betting, and a green revolution

In a zero interest rate environment, loads of people could try their hands at technology speculation. If interest rates hold at their existing altitude, this flavor of speculation is going to lose its appeal.

The talented investor with a clear eye toward leverage on future trends will have successes. But I think the party where you bet millions of dollars on ugly monkey pictures is over for a while.

But there are no guarantees. Many seem to be almost pining for a recession—for a reset on many things, from labor power to interest rates. We’ve had a good run since 2008. It might just happen.

Yet the stakes of our decisions are bigger than casino games. The project of humanity needs more than gambling. We need investment to forestall the worst effects of climate change.

Aggressive investment in energy infrastructure, from the mundane work of household heat pumps to the vast chess game of green hydrogen, will take up a lot of funding and mental energy going forward. Managing energy, making its use efficient, maintaining the transition off of fossil fuels—all of this has information technology implications. Ten years from now, houses, offices and factories will be smarter than ever, as a fabric of monitoring devices, energy sources, and HVAC machines all need to act in concert.
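To gesture at what “acting in concert” might mean, here’s a deliberately toy sketch of one tick of a control loop. Every interface and number in it is invented for illustration; real buildings would speak established protocols like BACnet or Matter rather than anything this naive.

```typescript
// Toy sketch only: all interfaces and constants here are invented for
// illustration of the coordination problem, not drawn from any real system.
interface TemperatureSensor { readCelsius(): Promise<number>; }
interface EnergySource { surplusKilowatts(): Promise<number>; }
interface Hvac { setHeatingKilowatts(kw: number): Promise<void>; }

// One tick of a control loop: heat toward the target, leaning harder on the
// heat pump when there is surplus (e.g. solar) energy to absorb.
async function controlTick(
  sensor: TemperatureSensor,
  source: EnergySource,
  hvac: Hvac,
  targetCelsius: number
): Promise<void> {
  const current = await sensor.readCelsius();
  const surplus = await source.surplusKilowatts();

  const error = Math.max(0, targetCelsius - current); // how far below target we are
  const ceiling = surplus > 1 ? 3 : 1;                // spend more when energy is cheap
  await hvac.setHeatingKilowatts(Math.min(error * 0.5, ceiling));
}
```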

The old cycle is about dead. But the next one is just getting started. I hope the brightest minds of my generation get more lasting and inspiring work to do than selling ads. Many of the biggest winners of the last cycle have fortunes to plow into what comes next, and fond memories of the challenges of reshaping an industry. There’s another round to play.

We have so much still to accomplish.

