The Benefits of Dynamic Rendering for SEO


JavaScript frameworks have grown in popularity over the past few years, largely due to the flexibility they offer. “JavaScript frameworks enable rapid development. They provide a better user experience. They offer better performance and enhanced features that are somehow lacking in traditional, non-JavaScript frameworks,” said Nati Elimelech, Technical SEO Manager at Wix.

“So it’s no surprise that very large websites or complex user interfaces with complex logic and functionality generally tend to use JavaScript frameworks these days,” he added.

At SMX Next, Elimelech provided insight into how JavaScript works for client-side, server-side, and dynamic rendering, and shared audit insights gained from implementing JavaScript on over 200 million websites.

Client-side or server-side rendering

Different rendering methods are suitable for different purposes. Elimelech made the case for dynamic rendering as a means of satisfying search engine crawlers and users, but first it is necessary to understand how client-side and server-side rendering work.

Client-side rendering

When a user clicks on a link, their browser sends requests to the server where the site is hosted.

“When we talk about JavaScript frameworks, this server responds with something a little different than what we’re used to,” Elimelech said.

“It responds with skeleton HTML – just the basic HTML, but with lots of JavaScript. Basically what it does is tell my browser to run the JavaScript itself to get all the important HTML,” he said, adding that the user’s browser then produces the rendered HTML (the final HTML used to build the page the way we actually see it). This process is known as client-side rendering.

Image: Nati Elimelech.

“It’s kind of like assembling your own furniture because the server says to the browser, ‘Hey, these are all the parts, these are the instructions, build the page. I trust you.’ And that means all the hard work is moved to the browser instead of the server,” Elimelech said.

Client-side rendering can be great for users, but there are cases where a client isn’t running JavaScript, which means it won’t get the full content of your page. One such example is search engine crawlers; although Googlebot can now see more of your content than ever before, there are still limitations.
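A minimal sketch of the kind of “skeleton” response described above may make this concrete. (The `root` element id and the `bundle.js` filename are illustrative assumptions, not details from the talk.)

```javascript
// A sketch of the skeleton HTML a client-side rendered app server
// might return: the real content is absent, and the browser must run
// the referenced JavaScript bundle to build the page.
// The id "root" and "/bundle.js" are hypothetical names.
function skeletonHtml() {
  return [
    "<!doctype html>",
    "<html>",
    "<head><title>Loading...</title></head>",
    "<body>",
    '  <div id="root"></div>', // empty mount point, no content yet
    '  <script src="/bundle.js"></script>', // JS that builds the page
    "</body>",
    "</html>",
  ].join("\n");
}
```

A client that never executes the script is left with an empty `<div>` instead of the page content, which is exactly the risk for non-JavaScript crawlers.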

Server-side rendering

For clients not running JavaScript, server-side rendering can be used.

“Server-side rendering is when all that JavaScript is executed server-side. All resources are requested server-side, and your browser and search engine bot don’t need to execute JavaScript to get the fully rendered HTML,” he said. This means server-side rendering can be faster and less resource-intensive for browsers.

A slide with a basic explanation of server-side rendering.
Image: Nati Elimelech.

“Server-side rendering is like providing your guests with an actual chair to sit on instead of having to assemble it,” he said, continuing his earlier analogy. “And, when you render server-side, you’re basically making your HTML visible to all kinds of bots, all kinds of clients. . . No matter how capable a client is with JavaScript, it can see the important final rendered HTML,” he added.
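The contrast with the skeleton response is that the server does the assembly itself. The sketch below, with a hypothetical `renderPage` function and page-data shape, shows the final HTML being produced before anything reaches the client:

```javascript
// A sketch of server-side rendering: the server runs the templating
// work itself and responds with the final HTML, so no client-side
// JavaScript is needed to see the content.
// renderPage and its data shape are illustrative, not Wix's actual code.
function renderPage({ title, heading, body }) {
  return [
    "<!doctype html>",
    "<html>",
    `<head><title>${title}</title></head>`,
    "<body>",
    `  <h1>${heading}</h1>`,
    `  <p>${body}</p>`,
    "</body>",
    "</html>",
  ].join("\n");
}

const html = renderPage({
  title: "Blue widgets",
  heading: "Blue widgets",
  body: "Everything about blue widgets.",
});
// html now contains the fully rendered content: a bot that runs no
// JavaScript still sees the <h1> and the paragraph text.
```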

Dynamic rendering

Dynamic rendering represents “the best of both worlds,” Elimelech said. Dynamic rendering means “switching between client-side rendered content and pre-rendered content for specific user agents,” according to Google.

Below is a simplified diagram explaining how dynamic rendering works for different user agents (users and bots).

A flowchart describing dynamic rendering.
Image: Nati Elimelech.

“So there is a request to a URL, but this time we check: Do we know this user agent? Is it a known robot? Is it Google? Is it Bing? Is it Semrush? Is it something we know about? If it’s not, we assume it’s a user and then we render on the client side,” Elimelech said.

In this case, the user’s browser runs the JavaScript to get the rendered HTML, but still gets the benefits of client-side rendering, which often include a perceived speed boost.

On the other hand, if the client is a bot, server-side rendering is used to serve fully rendered HTML. “So it sees everything that needs to be seen,” Elimelech said.

This represents the “best of both worlds” because site owners are still able to deliver their content, regardless of the client’s JavaScript capabilities. And, because there are two streams, site owners can optimize each to better serve users or bots without affecting the other.
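The decision Elimelech describes can be sketched as a simple user-agent check. (The bot list and function names below are illustrative; a real list would need ongoing maintenance as crawler user agents change, which is one of the costs discussed next.)

```javascript
// A sketch of the dynamic rendering decision: check the User-Agent
// against a list of known bots and route bots to pre-rendered
// (server-side) HTML, everyone else to client-side rendering.
// The patterns here are illustrative, not an exhaustive bot list.
const KNOWN_BOTS = [/googlebot/i, /bingbot/i, /semrushbot/i];

function isKnownBot(userAgent) {
  return KNOWN_BOTS.some((pattern) => pattern.test(userAgent || ""));
}

function chooseRendering(userAgent) {
  // Unknown agents are assumed to be users, as in the flowchart above.
  return isKnownBot(userAgent) ? "server-side" : "client-side";
}
```

In practice this check would run per request, for example in server middleware, before deciding which of the two responses to serve.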

But dynamic rendering is not perfect

There are, however, complications associated with dynamic rendering. “We have two streams to maintain, two sets of logic, caching, other complex systems; so it’s more complex when you have two systems instead of one,” Elimelech said, noting that site owners also need to maintain a list of user agents in order to identify bots.

The pros and cons of dynamic rendering
Image: Nati Elimelech.

Some might worry that serving search engine crawlers something different from what you show users could be considered cloaking.

“Dynamic rendering is actually a solution that’s preferred and recommended by Google, because what matters to Google is whether the important things are the same [between the two versions],” said Elimelech, adding that the “important things” are the things that interest us as SEOs: the content, the headers, the meta tags, the internal links, the navigation links, the robots directives, the title, the canonical, structured data markup and images. Anything to do with how a bot would react to the page, it’s important to keep the same; and when you keep them the same, especially the content and especially the meta tags, Google has no problem with that.

Potential site parity issues when using different JavaScript rendering methods
Image: Nati Elimelech.

Since it is necessary to maintain parity between what you serve to bots and what you serve to users, it is also necessary to audit for issues that could break that parity.

To audit potential issues, Elimelech recommends Screaming Frog or a similar tool that lets you compare two crawls. “So what we like to do is crawl a website as a Googlebot (or some other search engine user agent) and crawl it as a user and make sure there are no differences,” he said. Comparing the relevant items between the two crawls can help you identify potential issues.
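The core of that comparison can be sketched as a field-by-field diff between the two crawls. (The field names and data shape below are assumptions for illustration; in a real audit they would come from Screaming Frog exports or similar crawl data.)

```javascript
// A sketch of the parity check behind the two-crawl audit: given the
// SEO-critical fields extracted for one URL from a bot crawl and a
// user crawl, report which fields differ between the two versions.
// The field list is illustrative, echoing the "important things"
// named above (title, meta tags, canonical, robots, headings).
function parityDiff(botVersion, userVersion) {
  const fields = ["title", "metaDescription", "canonical", "robots", "h1"];
  return fields.filter((f) => botVersion[f] !== userVersion[f]);
}

// Hypothetical crawl results for the same URL:
const botVersion = {
  title: "Blue widgets",
  metaDescription: "All about blue widgets",
  canonical: "https://example.com/widgets",
  robots: "index,follow",
  h1: "Blue widgets",
};
const userVersion = { ...botVersion };

// An empty result means the two versions match on the checked fields;
// any field name in the result is a parity issue to investigate.
const issues = parityDiff(botVersion, userVersion);
```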

A slide with tools to audit your site's JavaScript versions.
Image: Nati Elimelech.

Elimelech also mentioned the following methods to detect problems:

“Remember that JavaScript frameworks aren’t going anywhere,” he said. “There’s a good chance you’ll run into one soon, so you better be prepared to handle them.”

Watch the full SMX Next overview here (free registration required).


About the Author

George Nguyen is an editor for Search Engine Land, covering organic search, podcasting and e-commerce. He has a background in journalism and content marketing. Prior to entering the industry, he worked as a radio personality, writer, podcast host, and public school teacher.

