Here’s a wonderful post by Eric Higgins all about refactoring and technical debt. He compares giant refactoring projects to Tetris:
Similar to running a business, Tetris gets harder the longer you play. Pieces move faster and it becomes harder to keep up.
Similar to running a business, you can never win Tetris. There is no true finish line. You only control how quickly you lose.
Similar to running a business, allowing too many gaps to build up in Tetris will cause you to lose.
I love this comparison, despite my mediocre Tetris skills. It does feel like even “easy” development becomes harder as technical debt grows on a project, much the same way Tetris pieces gain speed and provide little time to react as the stack grows. However, I do think perhaps I have a more optimistic view of technical debt overall. If you work slowly and carefully then you can build up a culture of refactoring and gather momentum over time.
There is no one true way to hide something on the web. Nor should there be, because hiding is too vague. Are you hiding visually or temporarily (like a user menu), but the content should still be accessible? Are you hiding it from assistive tech on purpose? Are you showing it to assistive tech only? Are you hiding it at certain screen sizes or other scenarios? Or are you just plain hiding it from everyone all the time?
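Each of those intents maps to a different technique. As a rough sketch (the class names here are my own, but the patterns are common utility conventions):

```css
/* Hidden from everyone, including assistive tech */
.hidden-for-all {
  display: none;
}

/* Hidden visually, but still announced by screen readers —
   the classic "visually hidden" utility pattern */
.visually-hidden {
  position: absolute;
  width: 1px;
  height: 1px;
  padding: 0;
  overflow: hidden;
  clip: rect(0, 0, 0, 0);
  white-space: nowrap;
  border: 0;
}

/* Hidden only at certain screen sizes */
@media (max-width: 600px) {
  .hidden-on-mobile {
    display: none;
  }
}
```

And for the inverse case — visible on screen but hidden from assistive tech on purpose — there’s the `aria-hidden="true"` attribute in the markup rather than anything in the CSS.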
In 2015 there were 24,000 different Android devices, and each of them was capable of downloading images. And this was just the beginning. The mobile era is gathering pace, with mobile visitors starting to eclipse desktop. One thing is certain: building and running a website in the mobile era requires an entirely different approach to images. You need to consider users on desktops, laptops, tablets, watches, and smartphones, then viewport sizes and resolutions, before finally getting to browser-supported formats.
So, what’s a good image optimization option to help you juggle all of these demands without sacrificing image quality and on-page SEO strategy?
If you’ve ever wondered how many breakpoints a folding screen will have, then you’ll probably like Optimole. It integrates with both the WordPress block editor and page builders to solve a bunch of image optimization problems. It’s an all-in-one service, so you can optimize images and also take advantage of features built to help you improve your page speed.
What’s different about Optimole?
The next step in image optimization is device specificity, so the traditional catch-and-replace method of image optimization will no longer be the best option for your website. Most image optimizers work in the same way:
Select images for optimization.
Replace images on the page.
Delete the original.
They provide multiple static images for each screen size according to the breakpoints defined in your code. Essentially, you are providing multiple images to try to cover every possible device. Optimized? Sure. Optimal? Hardly.
Optimole works like this:
Your images get sucked into the cloud and optimized.
Optimole replaces the standard image URLs with new ones – tailored to the user’s screen.
The images go through a fast CDN to make them load lightning fast.
Here’s why this is better: You will be serving perfectly sized images to every device, faster, in the best format, and without an unnecessary load on your servers.
A quick case study: CodeinWP
Optimole’s first real test was on CodeinWP as part of a full site redesign. The blog has been around for a while, during which time it has emerged as one of the leaders in the WordPress space. Performance-wise? Not so much. With over 150,000 monthly visitors, the site needed to provide a better experience for Google and mobile.
Optimole was used as one part of a mobile-first strategy. In the early stages, Optimole helped provide a ~0.4s boost in load time for most pages with it enabled. On desktop, Optimole is compressing images by ~50% and improving PageSpeed grades by 23%. Fully loaded time and total page size are reduced by ~19%.
Improved PageSpeed and quicker delivery
Along with a device-specific approach, Optimole provides image-by-image optimization to ensure each image fits perfectly into its targeted container. Google will love it. These savings in bandwidth will help you improve your PageSpeed scores.
It’s not always about the numbers; your website needs to conform to expected behavior even when rendering on mobile. You can avoid content jumping and shifting with any number of tweaks, but a container-based lazy loading option provides the best user experience. Optimole sends a blurred image at the exact container size, so your visitors never lose their place on the page.
We offer CDNs for both free and premium users. If you’re already using a CDN, we can integrate it from our end. The extra costs involved will be balanced out with a monthly discount.
Picking the perfect image for every device
Everyone loves srcset and sizes, but it’s time for an automated solution that doesn’t leak bandwidth. With client hints, Optimole provides dynamic image resizing that delivers a perfectly sized image for each and every device.
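For reference, the manual srcset/sizes approach being automated here looks something like this (file names and widths are hypothetical):

```html
<img
  src="photo-800.jpg"
  srcset="photo-400.jpg 400w,
          photo-800.jpg 800w,
          photo-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 50vw"
  alt="A descriptive alt text">
```

The browser picks the best candidate from the list, but you still have to generate and maintain every candidate yourself, and the breakpoints are frozen into your markup — which is the gap a client-hints-driven service aims to close.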
Acting as a proxy service allows Optimole to deliver unique images based on supported formats. Rather than replacing an image on the page with one broad-appeal version, Optimole provides the best image based on the information provided by the browser. This dynamism means WebP and Retina displays are supported for lucky users without needing to make any changes.
Optimole can be set to detect slower connections, and deliver an image with a high compression rate to keep the page load time low.
Industrial strength optimization with a simple UI
The clean and simple UI gives you the options you need to run your site; no muss, no fuss. You can set the parameters, introduce lazy loading, and test the quality without touching any of the URLs.
You can reduce the extra CSS you need to make page builder images responsive and compressed. It currently takes time and a few CSS tricks to get all of your Elementor images responsive. For example, the extra thumbnails from galleries and carousels can throw a few curve balls. With Optimole, all of these images are picked up from the page and treated like any other. Automatically.
One of the reasons to avoid changing image optimization strategies is the frightening prospect of going through the optimization process again. Optimole is the set-and-forget optimization option that optimizes your images without making changes to the originals. Just install, activate, and let Optimole handle the rest.
If you like what you see then you can get a fully functional free plan with all the features. It includes 1GB of image optimization and 5GB of viewing bandwidth. The premium plans start with 10GB of image optimization, and 50GB of viewing bandwidth, plus access to an enhanced CDN including over 130 locations.
If you’re worried about making a big change, then rest assured Optimole can be uninstalled cleanly to leave your site exactly as before.
This is a fantastic post by Ahmad Shadeed. It digs into the practical construction of a header on a website — the kind of work that many of us regularly do. It looks like it’s going to be fairly easy to create the header at first, but it starts to get complicated as considerations for screen sizes, edge cases, interactions, and CSS layout possibilities enter the mix. This doesn’t even get into cross-browser concerns, which have been somewhat less of a struggle lately but are always there.
It’s sometimes hard to explain the interesting and tricky situations that define our work in front-end development, and this article does a good job of showcasing that.
These two illustrations set the scene really well:
That’s not to dunk on designers. I’ve done this to myself plenty of times. Plus, any designer worth their salt thinks about these things right off the bat. The reality is that the implementation weirdness largely falls to the front-end developer and the designer can help when big choices need to be made or the solutions need to be clarified.
HLS stands for HTTP Live Streaming. It’s an adaptive bitrate streaming protocol developed by Apple. One of those sentences to casually drop at any party. Ahem. Back on track: HLS allows you to specify a playlist with multiple video sources in different resolutions. Based on available bandwidth, these video sources can be switched, allowing adaptive playback.
This is an interesting journey where the engineering team behind Kitchen Stories wanted to switch away from the Vimeo player (160 kB), but still use Vimeo as a video host because they provide direct video links with a Pro plan. Instead, they are using the native <video> element, a library for handling HLS, and a wrapper element to give them a little bonus UX.
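As a minimal sketch of that approach — assuming hls.js as the HLS-handling library and a hypothetical playlist URL standing in for a real Vimeo Pro direct link:

```html
<video id="player" controls></video>

<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  var video = document.getElementById('player');
  // Hypothetical direct HLS playlist URL (a Vimeo Pro plan provides real ones)
  var src = 'https://example.com/video/master.m3u8';

  if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari and iOS play HLS natively in the <video> element
    video.src = src;
  } else if (window.Hls && Hls.isSupported()) {
    // Elsewhere, hls.js implements HLS on top of Media Source Extensions
    var hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  }
</script>
```

The wrapper element mentioned in the post would sit around this to layer on the extra UX touches; the core adaptive playback is just the playlist plus the library.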
This video stuff is hard to keep up with! There is another new format called AV1 that is apparently a big deal as YouTube and Netflix are both embracing it. Andrey Sitnik wrote about it here:
That doesn’t even mention HLS, but I suppose that’s because HLS is a streaming protocol, which still needs to stream in some sort of format.
I wouldn’t say the term “CSS algorithm” has widespread usage yet, but I think Lara Schenck might be onto something. She defines it as:
a well-defined declaration or set of declarations that produces a specific styling output
So a CSS algorithm isn’t really a component where there is some parent element and whatever it needs inside, but a CSS algorithm could involve components. A CSS algorithm isn’t just some tricky key/value pair or calculated output — but it could certainly involve those things.
The way I understand it is that they are little mini systems. In a recent post, she describes a situation involving essentially two fixed header bars and needing to deal with them in different situations. In this example, the page can be in different states (e.g. a logged-in state has a position: fixed; bar), and that affects not only the header but the content area as well. Dealing with all that together is a CSS algorithm. It’s probably the way we all work in CSS already, but now we have a term to describe it. This particular example involves some CSS custom properties, a state-based class, two selectors, and a media query. Classic front-end developer stuff.
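The class names and pixel values below are my own invention, not Lara’s, but the general shape of that kind of algorithm — a custom property, a state-based class, a couple of selectors, and a media query working together — might look like:

```css
:root {
  --banner-height: 0px;
}

/* State class added when the logged-in bar is present */
.has-logged-in-bar {
  --banner-height: 46px;
}

/* The site header sits below the (possibly absent) logged-in bar */
.site-header {
  position: fixed;
  top: var(--banner-height);
}

/* The content area accounts for both fixed bars */
.site-content {
  padding-top: calc(var(--banner-height) + 60px);
}

@media (max-width: 600px) {
  .has-logged-in-bar {
    /* Suppose the logged-in bar collapses on small screens */
    --banner-height: 0px;
  }
}
```

The point is that no one declaration solves the problem; the system of declarations, taken together, is the algorithm.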
We’ve been writing a lot about refactoring CSS lately, from how to take a slow and methodical approach to getting some quick wins. As a result, I’ve been reading a ton about this topic and somehow stumbled upon this post by Harry Roberts about refactoring and how to mitigate the potential risks that come with it:
Refactoring can be scary. On a sufficiently large or legacy application, there can be so much fundamentally wrong with the codebase that many refactoring tasks will run very deep throughout the whole project. This puts a lot of pressure on developers, especially considering that this is their chance to “get it right this time”. This can feel debilitating: “Where do I start?” “How long is this going to take?” “How will I know if I’m doing the right thing?”
Harry then comes up with this metaphor of a refactoring tunnel where it’s really easy to find yourself stuck in the middle of a refactor and without any way out of it. He argues that we should focus on small, manageable pieces instead of trying to tackle everything at once:
Resist the temptation to refactor anything that runs right the way throughout the project. Instead, identify smaller and more manageable tasks: tasks that have a much smaller surface area, and therefore a much shorter Refactoring Tunnel.
These tasks can still aim toward a larger and more total goal but can be realised in much safer and shorter timeframes. Want to move all of your classes from BEM to BEM(IT)? Sure, but maybe just implement it on the nav first.
This way feels considerably slower, for sure, but there’s so much less risk involved.
Anyone on a team can work on styling a component without any fear of unintended side effects.
There is no pressure to come up with perfect names that will work now and forever.
There is no worry about the styles needing to be extremely reusable or that they play friendly with anything else. These styles will only be used when needed and not any other time.
There is an obvious standard to where styles are placed in the codebase.
There are some reasons why I don’t buy into it. Performance is one of them, as if choosing CSS-in-JS were some automatic performance win. Part of the problem (and I’m guilty of doing it right here) is that CSS-in-JS covers a wide scope of solutions. I’ve generally found there are no big performance wins in CSS-in-JS (more likely the opposite), but that’s irrelevant if we’re talking about something like CSS modules with the styles extracted and linked up like any other CSS.
I like Adam Laki’s Quick Tip: CSS Triangles because it covers that ubiquitous fact about front-end techniques: there are always many ways to do the same thing. In this case, drawing a triangle can be done:
with border and a collapsed element
with clip-path: polygon()
with transform: rotate() and overflow: hidden
with glyphs like ▼
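A couple of those methods, sketched out (dimensions and colors are arbitrary):

```css
/* 1. The classic border trick: a zero-size box whose
   transparent side borders meet to form a triangle */
.triangle-border {
  width: 0;
  height: 0;
  border-left: 50px solid transparent;
  border-right: 50px solid transparent;
  border-bottom: 100px solid rebeccapurple;
}

/* 2. clip-path: a regular box clipped to three points */
.triangle-clip {
  width: 100px;
  height: 100px;
  background: rebeccapurple;
  clip-path: polygon(50% 0, 0 100%, 100% 100%);
}
```

The clip-path version also degrades reasonably: a browser without support simply shows the unclipped square.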
I’d say that the way I’ve typically done triangles the most over the years is with the border trick, but I think my favorite way now is using clip-path. Code like this is fairly clear, understandable, and maintainable to me: clip-path: polygon(50% 0, 0 100%, 100% 100%); Brain: Middle top! Bottom right! Bottom left! Triangle!
My second-place method goes to an option that didn’t make Adam’s list: inline <svg>! This kind of thing is nearly as brain-friendly: <polygon points="0,0 100,0 50,100"/>.
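Fleshed out into a complete inline SVG (the viewBox and fill are my own choices), that polygon becomes:

```html
<svg viewBox="0 0 100 100" width="100" aria-hidden="true">
  <!-- Top-left, top-right, bottom-middle: a downward-pointing triangle -->
  <polygon points="0,0 100,0 50,100" fill="rebeccapurple"/>
</svg>
```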