How Can You Avoid Partial Rendering Issues on the Server?

If there are issues rendering pages on the server side, the content displayed to the user may have discrepancies. Partial rendering is one of the biggest problems that hurts a website's SEO performance. When JavaScript has to be rendered on the server side, there is a real chance that the page content will not be rendered fully.

Rendering services also may not wait long for a page to finish loading. Rendertron, Google's dynamic rendering solution, for instance, may not wait beyond 10 seconds. View-all pages are preferred by users as well as search engine crawlers, as long as they load fast. So how can you load a content-rich page with a large number of images faster?

Service workers

Before we get into the solution, it is worth reviewing service workers and their applicability in this scenario. You can think of a service worker as a kind of content delivery network that runs inside the web browser. A content delivery network (CDN) helps speed up a website by offloading much of the site's functionality to the network.

One major function that can be offloaded this way is caching, which helps improve site loading speed. Modern CDNs, however, are capable of doing a lot more than caching, such as compressing or resizing images, blocking attacks, and so on.

This browser-based mini-CDN is powerful as well: it can programmatically cache content for progressive web apps (PWAs), which in practice lets an application work offline. What should be noted is that service workers run separately from the main browser thread, so they can be used to offload the processes that slow down page loading and make rendering faster.
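As a rough illustration of that idea, here is a minimal sketch, in TypeScript, of a service worker that pre-caches a few assets and serves them cache-first; the file names, cache name, and URLs are hypothetical, not taken from any specific project:

```ts
// sw.ts -- a minimal "browser-based mini-CDN" sketch.
// Compile with the TypeScript "webworker" lib so the service worker types resolve.
declare const self: ServiceWorkerGlobalScope;
export {};

const CACHE_NAME = 'site-cache-v1';                     // assumed cache name
const PRECACHE_URLS = ['/', '/styles.css', '/app.js'];  // hypothetical assets

// Pre-cache core assets when the worker is installed.
self.addEventListener('install', (event: ExtendableEvent) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
  );
});

// Answer requests from the cache first and fall back to the network,
// which is what lets the app keep working offline.
self.addEventListener('fetch', (event: FetchEvent) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached ?? fetch(event.request))
  );
});
```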

Here is the solution in outline (a service worker sketch follows the list):

  • First, make an XHR request for the primary list of products; this returns quickly.
  • Then register a service worker that intercepts that request, creates a cache, passes the request through, and makes subsequent background requests for the other pages in the same set.
  • Once the results are loaded and fully cached, notify the page so that it can be updated.
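The sketch below shows one way the service worker side of this could look. The /api/products endpoint, the page count, and the RESULTS_CACHED message type are assumptions made for illustration, not a prescribed API:

```ts
// sw.ts -- sketch of the paginated-results strategy described above.
declare const self: ServiceWorkerGlobalScope;
export {};

const RESULTS_CACHE = 'product-results-v1';
const TOTAL_PAGES = 5; // hypothetical: how many pages make up the full set

self.addEventListener('fetch', (event: FetchEvent) => {
  const url = new URL(event.request.url);
  if (url.pathname !== '/api/products') {
    return; // let every other request pass through untouched
  }

  event.respondWith(
    caches.open(RESULTS_CACHE).then(async (cache) => {
      // Serve the cached copy if we already have one.
      const cached = await cache.match(event.request);
      if (cached) return cached;

      // Otherwise pass the request through and cache the response.
      const response = await fetch(event.request);
      await cache.put(event.request, response.clone());
      return response;
    })
  );

  // In the background, fetch and cache the remaining pages of the set,
  // then tell every open page that the full results are ready.
  event.waitUntil(
    (async () => {
      const cache = await caches.open(RESULTS_CACHE);
      for (let page = 2; page <= TOTAL_PAGES; page++) {
        const pageUrl = `/api/products?page=${page}`;
        if (!(await cache.match(pageUrl))) {
          const res = await fetch(pageUrl);
          if (res.ok) await cache.put(pageUrl, res);
        }
      }
      const clients = await self.clients.matchAll();
      clients.forEach((client) => client.postMessage({ type: 'RESULTS_CACHED' }));
    })()
  );
});
```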

So, on the first render the page will not get the full results, but the full results will be rendered on subsequent loads.
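On the page itself, the flow might look something like the following sketch; renderProducts and refreshFullProductList are hypothetical functions standing in for whatever rendering code the site already has:

```ts
// app.ts -- page-side sketch of the first-load / subsequent-load behaviour.
async function loadProducts(): Promise<void> {
  // First request: returns quickly with only the primary list of products.
  const firstPage = await fetch('/api/products?page=1').then((r) => r.json());
  renderProducts(firstPage); // partial render on the very first visit

  if ('serviceWorker' in navigator) {
    // The worker starts controlling fetches from the next load
    // (or immediately, if it calls clients.claim()).
    await navigator.serviceWorker.register('/sw.js');

    // When the worker reports that the full set is cached, refresh the view;
    // the extra pages now come straight from the browser cache.
    navigator.serviceWorker.addEventListener('message', (event) => {
      if (event.data?.type === 'RESULTS_CACHED') {
        refreshFullProductList();
      }
    });
  }
}

// Hypothetical rendering helpers provided elsewhere in the application.
declare function renderProducts(products: unknown): void;
declare function refreshFullProductList(): void;

loadProducts();
```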

However, there are also some constraints when working with service workers:

  • They require HTTPS.
  • Service workers intercept requests at the directory level at which they are registered; a worker registered at the site root can intercept requests for the entire site.
  • Background work must not require DOM access, since service workers cannot touch the DOM.

Some tasks that can run in the background via service workers are data traversal and manipulation, sorting, searching, and loading data.
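As a small example of the kind of background work meant here, a service worker could sort cached product data off the main thread when the page asks for it; the message shapes, cache name, and Product fields below are assumptions made for the sake of the sketch:

```ts
// sw.ts -- sketch of a background task: sorting cached results off the main thread.
declare const self: ServiceWorkerGlobalScope;
export {};

interface Product { name: string; price: number; } // assumed data shape

self.addEventListener('message', (event: ExtendableMessageEvent) => {
  if (event.data?.type !== 'SORT_PRODUCTS') return;

  event.waitUntil(
    (async () => {
      const cache = await caches.open('product-results-v1');
      const cached = await cache.match('/api/products?page=1');
      if (!cached) return;

      // The sort runs in the worker, so it never blocks page rendering.
      const products: Product[] = await cached.json();
      products.sort((a, b) => a.price - b.price);

      // Send the sorted list back to the page that asked for it.
      (event.source as Client | null)?.postMessage({
        type: 'PRODUCTS_SORTED',
        products,
      });
    })()
  );
});
```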

The above solution also prevents errors and timeouts from slowing down or breaking page rendering, though some content may be missing during the first page load. Subsequent loads, however, will serve the latest information instantly from the browser cache. Rendertron supports this rendering approach as well. However, since Google has removed Googlebot from the list of Rendertron-supported bots, you may have to add it back manually for this solution to work well.