Slow Websites

The web has grown bigger. Both in expansiveness and weight. Nick Heer’s “The Bullshit Web”:

The average internet connection in the United States is about six times as fast as it was just ten years ago, but instead of making it faster to browse the same types of websites, we’re simply occupying that extra bandwidth with more stuff.

Nick clearly explains what he means by bullshit, and one can see a connection to Brad Frost’s similarly framed argument. Nick talks about how each incremental interaction is a choice and connects the cruft of the web to the rise and adoption of frameworks like AMP.

Ethan Marcotte paints things in a different light by looking at business incentive:

…ultimately, the web’s performance problem is a problem of profitability. If we’re going to talk about bloated pages, we should do so in context: in the context of a web where digital advertising revenue is cratering for publishers, but is positively flourishing for Facebook and Google. We should look at the underlying structural issues that incentivize a company to include heavy advertising scripts and pesky overlays, or examine the market challenges that force a publisher to adopt something like AMP.

In other words, the way we talk about slow websites needs to be much, much broader. If we can do that, then we’ll have a sharper understanding of where—and how—the web can be faster.

It’s a systemic, state-of-the-industry problem that breeds slow websites. The cultural fight to fix it is perhaps just as important as the technical fights. Not that there isn’t a lot to learn and deal with on a technical level.

Addy Osmani wrote up a deep dive (a 20-minute read, according to Medium) that explores the cost of JavaScript to overall web performance. Everyone seems to agree JavaScript is the biggest problem area for slow websites. It’s not preachy, but rather a set of well-explained principles to follow in this era where the use of JavaScript is trending up (there’s a rough sketch of the first one just after the list):

  • To stay fast, only load JavaScript needed for the current page.
  • Embrace performance budgets and learn to live within them.
  • Learn how to audit and trim your JavaScript bundles.
  • Every interaction is the start of a new ‘Time-to-Interactive’; consider optimizations in this context.
  • If client-side JavaScript isn’t benefiting the user experience, ask yourself if it’s really necessary.
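
To make that first principle concrete, here’s a minimal sketch of route-level code splitting with dynamic import(). The page names and module paths are made up for illustration; the point is simply that heavy modules stay out of the initial bundle until a page actually needs them.

    // A rough sketch of per-page code splitting with dynamic import().
    // The page names and module paths here are hypothetical; the idea is
    // that heavy modules stay out of the initial bundle until needed.
    const routes = {
      '/': () => import('./pages/home.js'),
      '/editor': () => import('./pages/editor.js'),
    };

    async function loadPage(path) {
      const load = routes[path];
      if (!load) return; // no page-specific JavaScript for this route
      const page = await load();
      page.init(); // assumes each hypothetical page module exports init()
    }

    loadPage(window.location.pathname);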


Complexity

Frank Chimero published a new talk-turned-essay, Everything Easy is Hard Again.

May we all be as wonderfully self-reflective and eloquent as Frank one day. There is a lot there, so please read it. Part of the theme is that web design and development has seemingly repetitive cycles that can kick even quite experienced people back down the ladder:

I don’t feel much better at making [websites] after 20 years. My knowledge and skills develop a bit, then things change, and half of what I know becomes dead weight. This hardly happens with any of the other work I do.

And complexity is to blame.

Complexity is one of those “in the water” topics right now. Kevin Ball wrote that “we are in the midst of a massive and rapid period shift in complexity from the backend to the front.” Roneesh just celebrated untangling all this front-end complexity in his own mind and shared his thoughts.

I’ve read a good number of responses to Frank’s article as well. I particularly liked our own Robin Rendle’s from the newsletter:

I believe the reason for all this complexity and nuance in the work is because we no longer expect the web to be a series of simple, hyperlinked documents.

Our expectations, as users of the web, have now completely changed.

Websites are machines in and of themselves; they have state, notifications, alerts and warnings, components that appear under certain circumstances, location-aware features and complexity beyond anything we were building fifteen years ago. There are more devices and viewport widths rendering our websites with increasingly difficult performance requirements.

That feels right to me. It’s just what I was feeling when I wrote “What is the Future of Front End Web Development?”

What websites are being asked to do is rising. Developers are being asked to build very complicated things very quickly and have them work very well and very fast.

Though, Frank points out that even layout, fonts, and images have ballooned in complexity. The cause there, I’d argue, is the rightful focus on (and really, demand for) performance. But reactions to complexity are usually about those things plus a dozen or two others.

Perhaps things didn’t get complicated for no reason, and instead got complicated to compete. The web can do more, so we ask it to do more.

It’s tempting to think that the complexity comes entirely from that moreness: embracing more of the potential of the web, playing catch-up with native apps, and building more powerful tools for people. But I’m sure the whole story is more (ahem) complicated. Someone once told me the reason they think developer tooling has evolved and gotten more complicated is that developers generally aren’t asked to innovate on the business and product side. They build what they are told to, so they use their smarts to innovate on their own tools.

It depends, though. A few personal examples:

I’ve been running CSS-Tricks for over a decade. It’s a garden-variety WordPress site, and while it’s certainly evolved, it’s not all that much more complicated today than it was in the first few years. I’ve gotten better at working on it, in part because it’s changed so little that I’m more comfortable doing that work. I know just what all the spinning gears do and just where to put the oil, most of the time.

On the other hand, through CodePen, I’ve experienced a long product development cycle that started fairly simple and has arrived at extreme complexity. We sometimes question whether we’ve overdone the complexity, but for the most part, each step in that direction has been a response to make something else, ironically enough, less complicated. One example of that was adding React’n’Redux’n’Friends, which was a step up in the complexity of the development workflow, build, and deploy processes but, believe it or not, a step down in the complexity of the codebase. These tools help us build faster, debug easier, stay well tested, and provide a performant experience, to name some of the benefits. We didn’t add tooling just for kicks; we added tooling because we had growing problems to address.

Not everyone has the same problems. “The web is a big place” is a phrase I see thrown around sometimes, and I like it. Over a billion websites, they say. A big place indeed.

Check out Dan Cederholm’s favorite website:

It’s not responsive. It’s not optimized for iPhone. It looks blurry on a Retina display. It doesn’t use the latest HTML5/CSS3 framework. It doesn’t have a thoughtful vertical rhythm. The fonts are nothing special. It is neither skeuomorphic nor flat. It doesn’t have its own favicon. It doesn’t have a native app or Twitter or Instagram. It doesn’t use AJAX or SCRUM or node.js or Sinatra. It doesn’t have an API or an RSS feed or VC funding. It hasn’t been featured on a prominent tech blog or won an award.

It tells me the soups of the day.

I suspect it doesn’t need any more complexity, and literally nobody is advocating that it does. That’s why that “the web is a big place” sentiment is so useful. We talk about complexity, but it’s all opt-in. A wonderfully useful (and simple) website of a decade ago remains wonderfully useful and simple. Fortunately for all involved, the web, thus far, has taken backwards compatibility quite seriously. Old websites don’t just break.

I’ll bet the bits in Frank’s essay about web layout will strike a chord with many readers of this site. Frank makes the connection between table layout and grid layout, and it’s not a rare sentiment.

I’m sure Frank understands the benefits of the new grid layout (e.g. try re-arranging a <table> at a media query breakpoint), but the sentiment was more about cycles than a deep technical dive.

I’d say a reasonable argument could be made that, with CSS in particular, things are easier these days. CSS of old had us biting our fingernails about cross-browser support, scattering vendor prefixes throughout our code, and (lol) saying a prayer to the box model. Eric Meyer, despite publishing a heavy tome of CSS knowledge lately, says:

CSS has a great deal more capabilities than ever before, it’s true. In the sense of “how much there potentially is to know”, yes, CSS is more of a challenge.

But the core principles and mechanisms are no more complicated than they were a decade or even two decades ago. If anything, they’re easier to grasp now, because we don’t have to clutter our minds with float behaviors or inline layout just to try to lay out a page.

Swinging it back to developers innovating on their own tools for a moment, another consideration is the impact of site builders on our industry. I’ll always recommend a site builder app in the right circumstances. Does your photography business need a portfolio site? Your bakery a homepage? Your custom scarf business a blog and eCommerce store? Assuming this is a sub-$10,000 job, I’d rather see you use a specialized site builder product than hire out the job to anyone who is going to build something entirely custom. I don’t wanna go too far down that rabbit hole, but suffice it to say, because I’m not alone in that thinking, the market for low-end custom design and development work is largely gone.

There are more developers these days working on in-house teams or at agencies with big-ticket clients. That is, more and more developers on large-scope, long-term, highly complex jobs. So that’s where their minds are at. Big complicated problems with big complicated solutions. That’s what gets talked about. That’s what gets blogged about. That’s what gets argued about. That’s the topic at a lot of conferences I’ve been to.

While you certainly can make a soup-of-the-day website with an index.html file and FTP, blog posts about that are fewer and farther between and don’t get as many claps.


Shout out to Dave Rupert, my friend and ShopTalk Show co-host, who’s been pushing back against complexity in all its forms for as long as I’ve known him. I’m still trying to catch up.



Careful Now

Tom Warren’s “Chrome is turning into the new Internet Explorer 6” for The Verge has a title that, to us front-end web developers, suggests that Chrome is turning into a browser far behind in technology and replete with tricky bugs. Aside from the occasional offhand, generic “Chrome is getting so bad lately” comment you hear, we know that’s not true. Chrome often leads the pack for good web tech.

Instead, it’s about another equally concerning danger: developers building sites specifically for Chrome. In theory, that’s not really a thing, because if you build a website with web standards (to which there isn’t really much of an alternative) it’ll work in Chrome like any other modern web browser. But it is a thing if you build the site to somehow block other browsers and only allow Chrome. Warren:

Google has been at the center of a lot of “works best with Chrome” messages we’re starting to see appear on the web. Google Meet, Allo, YouTube TV, Google Earth, and YouTube Studio Beta all block Windows 10’s default browser, Microsoft Edge, from accessing them and they all point users to download Chrome instead. Google Meet, Google Earth, and YouTube TV are also not supported on Firefox with messages to download Chrome.

I wouldn’t call it an epidemic, but it’s not a good trend. Best I can tell, it’s server-side UA sniffing that entirely blocks the sites.
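
I have no visibility into what the actual server code looks like, but the general shape of that kind of block is easy to imagine. Here’s a hypothetical Express middleware sketch, purely an illustration of the pattern and not anyone’s real code:

    // Hypothetical illustration only; I have no idea what the real server
    // code looks like. Express is assumed here just to make the pattern concrete.
    const express = require('express');
    const app = express();

    app.use((req, res, next) => {
      const ua = req.headers['user-agent'] || '';
      // Naive sniffing: turn away anything identifying as Edge or Firefox
      // and point people at Chrome. UA strings are notoriously unreliable,
      // which is part of what makes this such a fragile practice.
      if (/Edge\/|Firefox\//.test(ua)) {
        return res
          .status(403)
          .send('This site works best in Chrome. Please download Chrome.');
      }
      next();
    });

    app.get('/', (req, res) => res.send('Welcome!'));
    app.listen(3000);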

Sheesh.

If anything, I’d think you’d just let people use the site and display a warning if you’re really worried some particular feature might not work. Or even better, fix it. I have no behind-the-scenes knowledge of why they made the choice to block certain browsers, but it’s hard to imagine a technical limitation that would force it. And if there is one, I’d suggest making it very publicly known, to incentivize the other browsers to support what’s needed, assuming it’s an established standard.

Even more concerning than browser-specific websites is seeing browsers ship non-standardized features just because they want them, not behind any vendor prefix or flag. There was a time when web developers would have got out the pitchforks if a browser was doing this, but I sense some complacency seeping in.

These days, the vibe is more centered around complaining about other browsers’ lack of support for things. For example, one browser ships something, we see one green dot on caniuse, and we lambast the other browsers for not catching up. Instead, we might ask: was it a good idea to ship that feature yet?

No modern browser is shipping new vendor-prefixed features anymore, since we all ultimately decided that was a bad idea. A side effect of that is that shipping a new feature in CSS or JavaScript is all the riskier. I would think shipping an unprefixed feature in a stable version of a browser would mean the feature is standardized and not going to significantly change. Yet that’s exactly what’s been happening.

In CSS, Chrome shipped the motion-* properties, but then that all changed to offset-*, and the old motion-* properties just stopped working. That’s more than just annoying; that kind of thing helps developers justify saying, “I just build this site for Chrome. If you wanna use it, use Chrome.” Fine for a demo, perhaps, but bad for the web industry as a whole. Again, I have no special insight into why this happens; I’m just a developer looking in from the outside.

Here’s another CSS one I just saw the other day. People are excited about text-decoration-skip: ink; because it looks great and helps people. They are using it a lot. But apparently, that’s not the correct name for it? It’s been changed to text-decoration-skip-ink: auto; and so Chrome 64 is canning text-decoration-skip: ink;. This stuff is hard to keep up with even while actively trying.
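
If you want to cope with a rename like that defensively, one option (a sketch of my own, not something from the post) is to feature-detect both spellings with CSS.supports() and adapt from there:

    // A small sketch of feature-detecting both spellings so styles (or
    // logging) can adapt. One defensive pattern, not something from the post.
    const supportsNew = CSS.supports('text-decoration-skip-ink', 'auto');
    const supportsOld = CSS.supports('text-decoration-skip', 'ink');

    document.documentElement.classList.add(
      supportsNew ? 'skip-ink-new' : supportsOld ? 'skip-ink-old' : 'skip-ink-none'
    );
    // In plain CSS, the simpler route is to declare both properties and let
    // the browser ignore whichever spelling it doesn't recognize.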

Chris Krycho had a take on it recently as well:

Over the past few years, I’ve increasingly seen articles with headlines that run something like, “New Feature Coming To the Web” — followed by content which described how Chrome had implemented an experimental new feature. “You’ll be able to use this soon!” has been the promise.

The reality is a bit more complicated. Sometimes, ideas the Chrome team pioneers make their way out to the rest of the browsers and become tools we can all use. Sometimes… they get shelved because none of the other browsers decide to implement them.

Many times, when this latter tack happens, developers grouse about the other browser makers who are “holding the web back.” But there is a fundamental problem in this way of looking at things: Chrome isn’t the standard. The fact that Chrome proposes something, and even the fact that a bunch of developers like it, does not a standard make. Nor does it impose an obligation to other browsers to prioritize it, or even to ship it.

This isn’t all to throw Chrome under the bus. I’m a Chrome fan. I’m sure there are examples from all the major vendors in the same vein. I’d just like my two cents to be: careful now. The web is the best platform to build for and is generally heading in a direction that makes that even truer. The easiest way to screw that up is not being careful with standards.

