Welcome to Centmin Mod Community

WebPerf 15+ Experts Share Their Web Performance Advice for 2018

Discussion in 'All Internet & Web Performance News' started by eva2000, Jun 8, 2018.

  1. eva2000

    eva2000 Administrator Staff Member

    A couple of years ago we reached out to a number of web performance experts in the community and asked them two questions: which performance tip would they recommend focusing on, and what are some common performance mistakes?


    The web performance advice they provided was top notch and extremely useful to the rest of the performance-driven community. That’s why we wanted to reach out to these web performance experts again and get their updated insights for 2018.

    Web Performance Questions


    For this performance advice post, we focused on the current landscape of web performance. Therefore, we asked questions related to what web developers should and shouldn’t do in 2018 to improve website speed.

    This post reveals which practices some of the best web performance experts believe to be outdated in 2018 as well as their top suggestions for people looking to optimize their site in 2018.

    Experts’ Answers


    Note: this list is in no particular order.

    Stefan Judis / @stefanjudis / perf-tooling.today


    Front end developer and curator of perf-tooling.today.

    Answer


    One old practice that immediately comes to mind is image spriting. HTTP/2 adoption today is pretty good, and since multiplexing lets several HTTP requests share one TCP connection, the cost of latency decreases while assets can be cached individually. Say a final goodbye to huge image sprites!

    Another outdated practice is icon fonts. Icon fonts have several downsides – they suffer from FOIT (Flash of Invisible Text), harming user experience, and when you’re not careful they’re very harmful to accessibility. SVG is supported cross-platform today and is definitely the way to go!
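    For contrast, a single inline SVG icon can look like the sketch below (a hypothetical search icon; the path data is a placeholder):

```html
<!-- One accessible, styleable icon: no font download, no FOIT,
     and it inherits the surrounding text color via currentColor. -->
<svg width="16" height="16" viewBox="0 0 16 16" role="img" aria-label="Search">
  <path d="M6 2a4 4 0 1 0 2.4 7.2l4.2 4.2 1.4-1.4-4.2-4.2A4 4 0 0 0 6 2z"
        fill="currentColor"/>
</svg>
```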

    And the last practice coming to mind is focusing too much on metrics that do not reflect user experience. Page load time is one of the metrics that do not tell the full story. For metrics that better reflect user experience, try looking at time to first paint, time to interactive, and the Speed Index provided by WebPageTest. They are all valuable metrics that give you more information on how your users experience your site.

    In the end, web performance optimization is always about shipping as little as possible, as fast as possible. Follow these two principles and you’ll mostly be good to go.

    Suggestions for improving web performance:

    The tooling around web performance evolved drastically last year, so my number one suggestion is to get to know the great tools like Lighthouse from Google, Sonarwhal from Microsoft, and WebPageTest. They all show possible improvements while providing useful resources with information about best practices.

    Try these tools and read, learn, improve!

    Chris Coyier / @chriscoyier / css-tricks.com


    Web designer and developer. Built CSS-Tricks and co-founded CodePen.

    Answer


    There are some performance best practices that are kind of a lot of work if you have to do it entirely manually, but can be done better and with little work if automated.

    Say for example you have a WordPress site. WordPress hooks you up with responsive images (<img srcset>) stuff out of the box, which is wonderful for performance. But install the Jetpack plugin, and you can, with the flip of a switch, also serve your images from the WordPress CDN (and optimize them at the same time). That’s powerful stuff for almost no effort. Another switch and you are lazy loading your images with Jetpack. Again a huge performance boon for little work.

    I love nerding out about performance tweaks you can make, but my favorites are always the ones with big impact and little effort.

    Peter Hedenskog / @soulislove / sitespeed.io


    Part of the performance team at Wikimedia and creator of Sitespeed – a set of open source tools that makes it easy to monitor and measure the performance of your website.

    Answer


    Make sure you don’t follow the old YSlow advice “Put JavaScripts at the Bottom”. You should always async/defer loading your JavaScript files. And never ever be dependent on JavaScript to render your page.
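    In markup, that advice boils down to one attribute per script tag (the file names here are placeholders):

```html
<!-- Neither of these blocks the HTML parser: -->
<script src="/js/analytics.js" async></script>  <!-- runs as soon as it arrives, order not guaranteed -->
<script src="/js/app.js" defer></script>        <!-- runs after parsing, in document order -->
```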

    Before you start optimizing your website you need to make sure you continuously measure the performance of your site (if you don’t do that already). Measure from real users (RUM) and do synthetic testing. Make sure the tools you use are GDPR compliant (if you have users that are based in the EU) so that you can continue to use them after May 25th, 2018. When you measure and feel confident in the metrics you can start to optimize.
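    As a sketch of what a RUM measurement can derive, the helper below computes a few common metrics from a PerformanceNavigationTiming-shaped entry. In a browser you would feed it the real entry from performance.getEntriesByType('navigation')[0]; the entry used here is hypothetical, with field names from the Navigation Timing Level 2 spec.

```javascript
// Derive basic RUM metrics from a navigation timing entry.
// The function is pure, so it runs anywhere (browser, Node, tests).
function deriveMetrics(entry) {
  return {
    ttfb: entry.responseStart - entry.startTime,        // time to first byte
    download: entry.responseEnd - entry.responseStart,  // response body transfer
    domInteractive: entry.domInteractive - entry.startTime,
    pageLoad: entry.loadEventEnd - entry.startTime,
  };
}

// Example with a hypothetical entry (all values in milliseconds):
const metrics = deriveMetrics({
  startTime: 0,
  requestStart: 120,
  responseStart: 340,
  responseEnd: 400,
  domInteractive: 900,
  loadEventEnd: 2100,
});
console.log(metrics); // { ttfb: 340, download: 60, domInteractive: 900, pageLoad: 2100 }
```

    A real RUM setup would beacon these numbers to an analytics endpoint rather than log them.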

    Denys Mishunov / @mishunov / mishunov.me


    Frontend developer, speaker, and author at Smashing Magazine.

    Answer


    First of all, I think there are not that many outdated practices in general. Any practice in our industry has served some particular purpose, and at some point in time it served it well. Yes, some of these techniques or practices do get outdated, but when it comes to performance, any practice we would otherwise call “outdated” still helps make a site/project/app faster. So I cannot really think of any truly outdated practice right now. In general, web performance is interesting in that it’s different for each particular project. For example, for a lot of projects with poor server infrastructure and HTTP/1.1, the number of server requests – one of the metrics affecting the performance of a website – might be crucial, and it could happen that doing some heavy computation on the client rather than communicating with such a server gives better performance results. At the same time, for projects driven by HTTP/2, this widely-used parameter might not play as significant a role, since more requests can be delivered simultaneously without blocking the page.

    That being said, even though web performance practices from the past can still do their job pretty well, there are some metrics we should consider moving away from these days as we get more descriptive and practical alternatives – the “load” event on window, for example, or “DOMContentLoaded”. At some point these were the only metrics we had, but nowadays they can easily give a false impression of performance because they do not take the current state of the technology into consideration. More on the modern state of performance metrics can be found in Tammy Everts’ “The Hunt for the Unicorn Performance Metric” talk.

    In web performance, one size doesn’t necessarily fit all: you should measure your own project and your own optimizations to achieve the best result. And it might easily be that some “outdated” practice is exactly what gives you the best result.

    Suggestions for improving web performance:

    The only suggestion I would make: take it easy. I don’t mean don’t bother about performance, of course. I mean take a small step, measure, take another small step, measure again, and so on. Never try applying several optimizations in one chunk. Otherwise, at best, you risk not knowing what actually led to the performance improvement; at worst, you might improve one thing while causing regressions in a couple of others and end up with worse performance overall.

    Also, always start with analysis: you need to identify the easy solutions and the more complex ones early in the optimization process. Always begin with the easy ones before you start digging into fancy techniques like server-side rendering: as practice shows, easy solutions like image optimization, Resource Hints (probably my favorite “low hanging fruit” optimization), and deferring resource loading will get you quite far, giving you some breathing room before you get to heavyweights like Service Workers.

    So, when it comes to performance, small and easy steps will get you farther than you might think.

    Stefan Baumgartner / @ddprrt / fettblog.eu


    Web developer/web lover at Dynatrace and co-host at the German Workingdraft podcast.

    Answer


    To optimize your website for performance, there’s no way around HTTP/2. Activate HTTP/2 on your servers, then look closely at how the transport of your assets changes. Chances are that you get immediate benefits from multiplexed streams over a single TCP connection. Then start to tear your website’s resources apart. Don’t transport everything at once; deliver only what’s needed on that particular page. Stop concatenating files just for the sake of saving one TCP connection. Be picky and remember: the best request is no request.
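    On nginx, as used by Centmin Mod, activating HTTP/2 is a small change to the TLS server block (the server name and certificate paths below are hypothetical; nginx 1.9.5 or later):

```nginx
# Minimal sketch: HTTP/2 is enabled per listen socket and, in practice,
# requires TLS since browsers only speak h2 over HTTPS.
server {
    listen 443 ssl http2;
    server_name example.com;

    ssl_certificate     /etc/ssl/example.com.crt;
    ssl_certificate_key /etc/ssl/example.com.key;
}
```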

    Maximiliano Firtman / @firt / firt.mobi


    Mobile+web development & consulting. Author of Programming the Mobile Web & jQuery Mobile, from O’Reilly.

    Answer


    I’m not sure there are too many outdated practices, as the basics of web performance are still the same. Maybe CSS sprites make no sense anymore, but most of the practices – even bundling all rendering-critical JS in one file – are still important even on HTTP/2. There are some techniques that are now under discussion, such as the usage of LQIP (Low Quality Image Placeholders) and JPEG Compressive Images.

    Suggestions for improving web performance:

    • Analyze network opportunities: Brotli, HTTP/2, QUIC
    • Help the browser as much as possible: DNS Prefetch, Preconnect hint, Preload only on important assets
    • Code splitting for big apps, but bundling in one file per group
    • Be careful with repaints and the usage of active listeners
    • Use Reactive Web Performance to adapt the experience for different scenarios based on Client Hints and other APIs
    • Use next-generation image formats and ideas: including Zopfli compressor for PNGs, Guetzli for JPEG, WebP and videos instead of Animated GIFs among other ideas.
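    The “help the browser” hints above are plain markup. A sketch with hypothetical host names – each hint warms up a different stage of the connection:

```html
<!-- Resolve DNS early for a third-party host: -->
<link rel="dns-prefetch" href="//cdn.example.com">
<!-- Go further: DNS + TCP + TLS handshake up front: -->
<link rel="preconnect" href="https://cdn.example.com" crossorigin>
<!-- Fetch a critical asset before the parser discovers it: -->
<link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>
```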
    Peter Cooper / @peterc / peterc.org


    Publisher-in-chief at CooperPress which publishes Web Operations Weekly. Software developer and code experimenter.

    Answer


    Outdated practices:

    • Stop trying to make bad features faster, and focus on getting rid of features entirely. Your pages and apps often don’t need to be as heavy as they are.
    • The speed of the underlying language behind your services is a secondary consideration nowadays.
    • A lot of classic JavaScript micro-optimizations have become obsolete as engines continue to improve.
    • Resource inlining, asset concatenation, and splitting assets across subdomains are less important than ever due to HTTP/2.

    Suggestions for improving web performance:

    • Get on top of HTTP/2, particularly if your pages/apps have to remain complex in terms of assets, scripts, etc.
    • See if you can go “static”. The result will be much faster load times (even if it’s at the expense of longer build times) and you’ll find it easier to use global CDNs, aggressive caching, and follow other performance best practices.
    Dean Hume / @DeanoHume / deanhume.com


    Software developer. Author of Fast ASP.NET Websites, a book aimed at improving the performance of high transaction websites.

    Answer


    This is going to sound a little crazy, but I don’t really believe there are many (if any) outdated web performance practices in 2018. If you think back to Steve Souders’ amazing book High Performance Web Sites and his 14 rules for faster loading web sites, all of those rules are still applicable today. It’s kinda reassuring to know that these rules have stood the test of time! While we have some newer techniques, these original rules are still the foundations.

    Suggestions for improving web performance:

    • Get a service worker on your site! With a few lines of code, you can have caching in place that will produce lightning fast response times. The best part is – even if a user visits your site and their browser doesn’t support them, it will simply fall back. It’s a no brainer!
    • If you haven’t already done so, upgrade your site to use HTTP/2. HTTP/2 uses multiplexing, which allows your browser to fire off multiple requests at once on the same connection and receive the responses back in any order. Compared to HTTP/1.1, this means no multiple connections and much faster response times.
    • If your server supports Brotli, I highly recommend enabling it. Brotli is a compression algorithm that can be more effective than gzip and Deflate in certain circumstances. Zopfli, similarly, is a good solution for resources that don’t change much and are designed to be compressed once and downloaded many times. In fact, at settled.co.uk, we noticed a 10% improvement in the size of the files that we compressed using Brotli.
    • Use responsive images! According to the HTTP Archive, images make up around 50% of the average web page. By using responsive images, you can tailor image sizes to the browser’s viewport and, in turn, save on the total download size of your pages. You save your users bandwidth and ensure speedy response times at the same time!
    Jem Young / @JemYoung / jemyoung.com


    Software engineer at Netflix and recent speaker at the #PerfMatters conference on the topic of “Modern Performance in the Year of the Dog”.

    Answer


    Stop thinking a framework or library will solve your performance problems. Measure, identify problem areas, and address any issues. True performance comes from taking a holistic view of your website and empathizing with your user base.

    Start treating mobile as a first-class citizen. The next 4 billion people joining the internet will be on mobile devices, so when you think about performance, think mobile performance.

    Léonie Watson / @LeonieWatson / tink.uk


    Accessibility engineer, W3C Web Platform WG co-chair, and recent speaker at the #PerfMatters conference on the topic of “There’s more to performance than meets the eye”

    Answer


    More things than you think affect Time To Interaction (TTI). When someone is running an Assistive Technology (AT) like a screen reader, browsers behave differently and this has an impact on performance. As well as creating the DOM, the browser creates an accessibility tree and, depending on the browser, gives the screen reader different ways to access the information in the accessibility tree and present that information to the user. With large pages, this can add several seconds to the TTI for screen reader users, even in the latest versions of Chrome and Firefox.

    Brian Jackson / @brianleejackson / kinsta.com


    CMO at Kinsta, blogger at Woorkup, and developer of the perfmatters WordPress performance plugin.

    Answer


    Even in 2018, I still see many users combining their JavaScript and CSS. In a lot of cases this is no longer needed, because most sites are now running over HTTPS (or should be) and utilizing HTTP/2, which has better support for parallelism. In fact, combining things like this can result in a slower site.

    Another outdated practice or misconception I see a lot is thinking a CDN won’t have a huge impact on the performance of your site and that it’s something you’ll get to eventually. If you’re serving visitors globally, a CDN is essential to speed up the delivery of your assets (depending on the locations, I’ve seen up to a 70% decrease in load times). A CDN is not optional in 2018; it should be a normal part of a web developer’s stack.

    Developers need to stop just including external scripts or services on a whim and determine just how much each might impact the performance of a site. Take Font Awesome for example. While it’s probably one of the most popular ways to easily include font icons on your site, it’s better to determine which icons you’re actually using. If your site is only utilizing 10 out of the hundreds of icons… repackage your icon fonts with a tool like IcoMoon instead. I’ve seen this drop the size of the file down by over 90%!

    In other words… don’t just grab the CDN hosted script because it’s the popular thing to do. Take a few moments and determine if that is the best way. A lot of times, hosting something on your own CDN is better as it reduces another DNS lookup and you’ll have more control over caching of the file, etc. If you take a performance-driven approach to these things it can quickly result in a much faster site.

    Suggestions for improving web performance:

    1. Stop trying to save money by going with cheap web hosting. A lot of times when it comes to hosting you get what you pay for. Your hosting provider is the backbone of your site and probably plays one of the most important roles in just how fast your site will load. If you’d rather spend your time growing your business and site, go with a managed host, especially if you don’t have the knowledge to troubleshoot server-side related issues.
    2. If you’re using PHP, upgrade as fast as you can to the latest versions. Not only will PHP 5.6 be EOL this year in terms of support and security, but PHP 7.2 has been shown to handle up to 3x as many requests. Check out these PHP and HHVM benchmarks. It’s also important to note that in regards to WordPress, HHVM is no longer being supported.
    3. Optimize your images. This might sound like a broken record to many, but in 2018, 50% of an average web page’s weight is still made up of images. Finding a good balance of compression and quality is essential for every website.
    4. If you’re struggling with WordPress performance you may need to go beyond the typical troubleshooting steps. I recommend utilizing amazing enterprise tools like New Relic which can help pinpoint slow queries or bad code. Check things such as autoloaded data in your wp_options table, corrupt transients, cache-hit ratios, CRON jobs, etc. I can’t tell you how many times I’ve seen corrupt transients and large wp_options tables with unnecessary autoloaded data bring WordPress sites to a crawl.

    There is also great software which can help larger and more dynamic WordPress sites such as WooCommerce, community sites, etc. Redis, in terms of caching, allows for the reuse of cached objects rather than requiring the MySQL database to be queried a second time for the same object. This can help decrease load time on the database. ElasticSearch is another one that helps speed up WordPress search by providing an additional indexed layer which is quicker to search than a MySQL query against the database.

    Anselm Hannemann / @helloanselm / helloanselm.com


    Front-end developer creating solid, scalable code architectures. Curator of the amazing WDRL Newsletter.

    Answer


    While I’m not sure this fits ‘outdated’ very well, I think it’s important to highlight that you shouldn’t focus on specific parts of your website being fast but on the overall experience. I see a lot of services that render incredibly fast once loaded, thanks to virtual DOM and other cool techniques, but the same pages take several seconds to load the initial layout because of the heavy render-blocking JavaScript application they serve. Instead, we should use code-splitting to serve a very small bundle for the initial application experience and asynchronously load additional features later on.
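    The code-splitting idea boils down to: load a chunk only when it is first needed, and never load it twice. A minimal sketch of that memoization, independent of any bundler (with webpack the loader would typically be () => import('./feature.js')):

```javascript
// Wrap an async loader so repeated calls share one in-flight promise.
function once(loader) {
  let promise;
  return () => (promise ??= loader());
}

// Hypothetical usage: the expensive module is only "fetched" on first call.
let loads = 0;
const loadChart = once(() => {
  loads += 1;
  return Promise.resolve({ render: () => 'chart rendered' });
});

loadChart();
loadChart();
console.log(loads); // 1 -- both calls share the same promise
```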

    Suggestions for improving web performance:

    First of all, I think one of the most important yet often forgotten things to optimize the page load performance is caching. Especially with immutable cache we can speed up the rendering of a page for recurring visitors noticeably with relatively low effort.

    Second to that, serving assets as quickly as possible is key to good performance. Whether you have a JavaScript-driven application or an HTML document, you should indicate downloadable assets such as your scripts, images, and CSS via HTML preload elements, or even consider HTTP/2 Server Push, to ensure that the content most important for rendering the initial experience is served as quickly as possible.

    Third, refactor your code and leave out everything you don’t need. This might include libraries or some over-engineered code you wrote previously. You might even want to consider not transpiling your JavaScript anymore depending on what browsers you need to support.

    Sergey Chernyshev / @sergeyche / sergeychernyshev.com


    Web technologist with passion for web performance and open source. Organizer of Meetup: NY Web Performance and WebPerfDays NY.

    Answer


    Don’t use technical metrics to represent real user’s experience.

    The most common approach these days is to measure so-called “page load” time, calculated from the start of navigation to the time the browser’s onLoad event fires. Unfortunately, this event, like many other “technical” events inside the browser, only represents the inner workings of the page, not the experience real users have at that point, because the page is often in an incomplete state that is not meaningful to the user.

    This applies to other technical metrics currently in use, from Time To First Byte (TTFB), which measures the time to generate the page on the server, to Time To First Paint, which is a bit better as it represents when the page stops being completely unusable for the user, but still does not really tell us when the experience becomes useful.

    There are some automatable metrics emerging, like First Contentful Paint (FCP), which attempts to measure when parts of the DOM are first painted, and Time To Interactive (TTI), which attempts to represent when the browser’s CPU is idle enough after First Contentful Paint for the user to start scrolling and interacting with the page.

    These are better automated metrics that tools can capture without developer involvement, which is why you see more information about them: tool providers like Google can work with them in bulk without working with individual sites.

    The best option, however, for you as the developer of a specific site is to measure specific events in your application, like the “time to first Tweet” on Twitter or the slightly more generic “Pinner wait time” on Pinterest. Metrics like these will let you track user experience as it relates to your product and the business KPIs you are trying to improve.

    You can use the W3C User Timing API and, hopefully in the future, the upcoming Element Timing API (currently supported only by SpeedCurve in their synthetic tools) to report what matters on your site and track it in relation to your performance efforts.

    Suggestions for improving web performance:

    At this point in time, many technical solutions and tricks exist to make your site faster, and many frameworks and technical approaches to make that happen: from traditional server-side rendered pages with inline critical CSS and asynchronously loaded JS, to Single-Page Applications (SPAs) that use the browser history API to update the URL and the contents of the page without full deconstruction and reconstruction of the DOM on each navigation, all the way to Progressive Web Apps (PWAs) that utilize Service Workers to progressively enhance your pages, caching the application shell and data locally and only requesting changes over the network when necessary.

    All of these are commendable methods of speeding up the technology, but in my opinion the main solution to slowness lies outside the technical realm itself: we need to solve the overall disregard for performance in the product development lifecycle, which leaves it until the product is built and the resulting app is slow and disappointing, requiring post-development optimization.

    The only way for us to change our ways is to start “Designing Speed” the same way we design visual aspects of user experience and branding and the way we design technical solutions for new features.

    Only by making speed a first-class citizen in product and technology conversations can we make sure it is built into each product. This is extremely important because, unlike other functional features, speed is much harder to add or remove after development is done, and the cost of rework is sometimes prohibitively high.

    This is not unlike the responsive design process, which made us all see sense regarding multi-device support after we neglected it for so long, or accessibility initiatives, which are, unfortunately, still quite neglected by our industry. As Tammy Everts pointed out in her talk about hunting the performance metric unicorn, we need to scale empathy within our organizations, and to do so it is critical to inject speed design into the organization’s workflow, together with general education about web performance.

    There are not that many resources available about this yet, but you can read my article where I propose “Progressive Storyboards” as a visual technique, and check out speedpatterns.com, a new and upcoming catalog of speed design patterns (feel free to contribute as it is being built).

    Aaron Gustafson / @AaronGustafson / aaron-gustafson.com


    Web standards advocate at Microsoft. Author of Adaptive Web Design.

    Answer


    In 2018, I’m hopeful that most developers have made the switch from icon fonts to SVG icons. Not only are SVGs far smaller, they’re also more flexible and better supported than icon fonts.

    My top performance-related suggestions for 2018 are really the same ones I’ve been harping on since we were on dial-up:

    1. Get rid of any extraneous/unnecessary images, scripts, and CSS. Make them fight for their place in your pages.
    2. Optimize your images! First, choose the right format. Provide more performant alternatives like WebP to browsers that support them. Scale them to the sizes you need. Use a tool (or three) to compress the heck out of them. Mark them up as adaptive images (picture and/or srcset and sizes as appropriate) to deliver the best, smallest image possible to your users.
    3. Minify your text-based assets, including your HTML, as much as possible. If you can, pre-compress your files using both Gzip and Zopfli compression to improve server-side performance.
    • Concatenate common resources. Even though HTTP/2 streams files faster, combining commonly-referenced files like your global CSS and JavaScript is still a good idea.
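    The adaptive-image markup from point 2 can be sketched as follows (the asset names and breakpoints are hypothetical): WebP for browsers that support it, with scaled JPEG fallbacks chosen via srcset/sizes.

```html
<picture>
  <!-- Served only where image/webp is supported: -->
  <source type="image/webp" srcset="hero-480.webp 480w, hero-960.webp 960w">
  <!-- JPEG fallback; the browser picks a size based on srcset + sizes: -->
  <img src="hero-960.jpg"
       srcset="hero-480.jpg 480w, hero-960.jpg 960w"
       sizes="(max-width: 600px) 480px, 960px"
       alt="Hero image">
</picture>
```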

    One final suggestion that’s new is to add a Service Worker, which enables you to handle caching more elegantly and speed up page rendering. For browsers that support this new worker type (which is all current browser versions), the performance benefits to your users will be huge.
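    A minimal sketch of the progressive-enhancement registration described above: browsers without service worker support simply skip it. The '/sw.js' path is hypothetical.

```javascript
// Pure feature check, separated out so it can be tested anywhere.
function supportsServiceWorker(nav) {
  return Boolean(nav && 'serviceWorker' in nav);
}

// In the page: register only when the browser supports service workers.
if (typeof navigator !== 'undefined' && supportsServiceWorker(navigator)) {
  navigator.serviceWorker
    .register('/sw.js')
    .then((reg) => console.log('service worker registered, scope:', reg.scope))
    .catch((err) => console.error('service worker registration failed:', err));
}
```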

    Vitaly Friedman / @smashingmag / smashingmagazine.com


    Co-founder of Smashing Magazine, a leading online magazine dedicated to design and web development.

    Answer


    I think at this point it’s critical to look into what you can achieve with service workers for advanced performance optimization, but in terms of low-hanging fruit, I’d definitely explore optimizing web font loading and deferring/dealing with third-party scripts. Most of the time they slow down the entire experience massively.

    Beyond that, obviously, examining and deferring JavaScript in general – with code splitting via webpack, for example – has become extremely important for every website with a heavy JavaScript bundle. You might find some useful ideas in a checklist I published recently.


    Harry Roberts / @csswizardry / csswizardry.com


    Consultant Front-end Architect: Google, UN, BBC, Kickstarter, Etsy.

    Answer


    There are some very specific things that I would consider anti-patterns that developers should avoid doing, but they’re not particularly specific to 2018. The first two both pertain to loading JavaScript in ways that were previously thought to be beneficial to performance, but actually happen to be a net loss.

    The first is using document.write to instantiate a new script. This can block the parser for whole seconds at a time. Chrome has already begun intervening on slow connections and is effectively blocking its usage:

    Based on instrumentation in Chrome, we’ve learned that pages featuring third-party scripts inserted via document.write() are typically twice as slow to load than other pages on 2G.

    The second is the use of asynchronous JavaScript snippets to load subsequent files. This is still a surprisingly common practice – even used by Google Analytics – despite better alternatives being available. The issue with a snippet like the one below is that the reference to Google Analytics is actually just a JavaScript string, not an actual reference to a file. This means that the URL is not discovered until the browser has parsed and executed this script block. If the URL were in a regular script tag’s src attribute, then the browser’s secondary parser – the lookahead pre-parser, or preload scanner – could find the reference much sooner, making everything much faster.

    <script>
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

    ga('create', 'UA-xxxxxxx-x', 'auto');
    ga('send', 'pageview');
    </script>

    Both of these are oddly specific, but I do wish people would stop using them: their time is done.
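    For the Google Analytics case specifically, the fix is to put the URL in real markup so the preload scanner can discover it, and queue commands until the script arrives; this mirrors Google’s documented alternative async snippet (the UA-xxxxxxx-x ID is a placeholder, as in the snippet above):

```html
<!-- The src is a real attribute, discoverable by the preload scanner;
     `async` keeps it from blocking the parser. -->
<script async src="https://www.google-analytics.com/analytics.js"></script>
<script>
  // Queue ga() calls until analytics.js loads and drains the queue.
  window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
  ga.l = +new Date();
  ga('create', 'UA-xxxxxxx-x', 'auto');
  ga('send', 'pageview');
</script>
```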

    Suggestions for improving web performance:

    • Move over to HTTP/2: It currently enjoys about 80% global support.
    • Begin thinking about offline: ServiceWorker is wonderful, and it doesn’t need to be a complex addition.
    • Consider the next billion users: there are lots of people coming online who don’t enjoy the same connectivity that we enjoy in the West.
    Summary


    In conclusion, there are a variety of different ways you can optimize your website’s performance in 2018. However, a few key suggestions noted by multiple web performance experts above include:

    • Use HTTP/2
    • Implement SVG icons instead of icon fonts
    • Cache your static content
    • Add service workers

    Thanks to all of the web performance experts who participated! Taking this advice and implementing best practices or removing outdated ones on your own site is a step towards a faster web experience for all. Let us know in the comments section below what you found was the best piece of advice.



    The post 15+ Experts Share Their Web Performance Advice for 2018 appeared first on KeyCDN Blog.

    Last edited: Jun 8, 2018
  2. eva2000

    eva2000 Administrator Staff Member

    yup WPT and Speedindex WebPerf - PageSpeed - How to use webpagetest.org for page load speed testing :)

    ah learned of a new tool sonarwhal, a linting tool for the web :)

    Wish folks would also mention that different implementations of Brotli on different web servers have different performance, and would highlight page load speed vs scalability. Benchmarks done for Centmin Mod Nginx brotli vs gzip and Cloudflare's brotli vs gzip here and here.

    pre-compressed gzip and even brotli perform way better than on-the-fly compression - benchmarks here Nginx - Nginx with Cloudflare zlib fork VS ngx_brotli compression level tests

    still need to learn and read up about this :)