The Impact of JavaScript on SEO and Optimization Methods

SEO analysis and issues

After analyzing and diagnosing the SEO of several sites, the most obvious issue is that they are built on WordPress, and both front-end performance and internal optimization such as code simplification are done poorly. The use of JavaScript on these sites stands out in particular. JS has become increasingly important to SEO in recent years: whether for asynchronous loading or for page interactions, today's websites can hardly avoid JS scripts, and improper use of JavaScript can have a huge impact on SEO.

Search engine handling of JS

A brief outline of how Google crawls pages. For a traditional HTML page, Googlebot downloads the HTML, extracts URLs from the source code, and quickly visits those URLs; it then downloads the CSS files, passes the downloaded resources to Google's Indexer, and the page is indexed. For content generated by JavaScript, Googlebot downloads the HTML but finds no links in the source code, because the JavaScript has not yet been executed. It then downloads the CSS and JavaScript files and uses the WRS (Web Rendering Service, a part of the Indexer) to parse, compile, and execute the JavaScript, fetching data from external APIs and databases so that the Indexer can index the rendered content. Only then does Google discover new links and add them to the crawl queue.
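The difference between the two paths can be sketched with a toy example. The markup and the link-extraction function below are illustrative assumptions, not Google's actual code; they only show why the first, HTML-only pass finds no links in a JS-rendered page.

```javascript
// Two pages that render the same link in a browser. Only the first
// exposes it in the raw HTML source that the initial crawl pass sees.

const staticHtml = `
  <body>
    <a href="/category/news/">News</a>
  </body>`;

// Here the link only exists after the script runs; a crawler that
// does not execute JavaScript sees an empty <div>.
const jsRenderedHtml = `
  <body>
    <div id="nav"></div>
    <script>
      const a = document.createElement('a');
      a.href = '/category/news/';
      a.textContent = 'News';
      document.getElementById('nav').appendChild(a);
    </script>
  </body>`;

// Naive link extraction, roughly what a pre-render crawl pass does.
function extractHrefs(html) {
  return [...html.matchAll(/<a\s+href="([^"]+)"/g)].map((m) => m[1]);
}

console.log(extractHrefs(staticHtml));     // [ '/category/news/' ]
console.log(extractHrefs(jsRenderedHtml)); // []
```

Only after the rendering step (WRS executing the script) does the second page's link become discoverable, which is why JS-dependent links sit in the crawl queue later than plain HTML links.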

Do not use JS for important links.

Search engine crawling relies on following links. If important links require JS to be executed or parsed before they appear, search engines may not be able to follow them. This is not to say that links can never be generated by JS; the shared footer of a website, for example, is often injected with JavaScript. But pages that you want indexed should at least be reachable through plain, standard-format links that give crawlers an entry point.
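The three markup snippets below are hypothetical examples of the same navigation item. Only the first gives a crawler a real href to follow; the other two hide the destination inside JavaScript, and the simple link-discovery function (an assumption standing in for a crawler's behavior) finds nothing in them.

```javascript
// Good: a standard link. Crawlers can follow it, and JS can still
// enhance it (e.g. intercept the click) without hurting SEO.
const crawlable = '<a href="/products/widgets/">Widgets</a>';

// Bad: the destination only exists inside an event handler.
const jsOnly =
  '<span onclick="location.href=\'/products/widgets/\'">Widgets</span>';

// Bad: an anchor with no real href; crawlers find nothing to queue.
const fakeAnchor =
  '<a href="javascript:void(0)" data-url="/products/widgets/">Widgets</a>';

// A crawler's link discovery is essentially "collect real hrefs":
function discoverLinks(html) {
  return [...html.matchAll(/href="([^"]+)"/g)]
    .map((m) => m[1])
    .filter((u) => !u.startsWith('javascript:'));
}

console.log(discoverLinks(crawlable));  // [ '/products/widgets/' ]
console.log(discoverLinks(jsOnly));     // []
console.log(discoverLinks(fakeAnchor)); // []
```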

Use lazy loading and waterfall with caution.

Lazy loading images, or even lazy loading text content, is a method many websites use to improve page speed to some extent. However, when implementing lazy loading with JavaScript, note that whether the content loads on user interaction, such as clicking a 'more' link, or on scrolling down the page, search engine spiders do not perform these actions and may never see the lazily loaded content. Whether you lazy-load more content on the same page or pull in more items from other list pages, it can cause crawling and indexing issues.

Many websites use a waterfall (infinite-scroll) layout on list pages, which users have grown accustomed to, largely by imitating social media feeds. But social media sites often deliberately block search engines and do not want their content indexed. If you want your website's content fully indexed, try to avoid waterfall layouts. If you do use one, design other entry points that give search engines links to follow.
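One way to provide those extra entry points is to render ordinary pagination links alongside the waterfall list. The helper below is a minimal sketch under assumed URL patterns (`/gallery/page/N/` is illustrative, not a required convention): the plain hrefs stay in the HTML for spiders, while browser-side JS is free to replace them with scroll-driven loading.

```javascript
// Crawler-visible fallback: emit ordinary <a href> pagination links
// for a list that is otherwise loaded as an infinite-scroll waterfall.
function paginationLinks(basePath, totalItems, perPage) {
  const pages = Math.ceil(totalItems / perPage);
  const links = [];
  for (let p = 1; p <= pages; p++) {
    links.push(`<a href="${basePath}/page/${p}/">${p}</a>`);
  }
  return links.join('\n');
}

// 95 items at 30 per page -> links to /gallery/page/1/ .. /gallery/page/4/
console.log(paginationLinks('/gallery', 95, 30));

// In the browser, JavaScript can hide these links and load items on
// scroll instead (e.g. an IntersectionObserver watching a sentinel
// element); spiders that never scroll still reach every page via the hrefs.
```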

Webpage opening and response speed

Using JS usually slows down page opening and response. Downloading JS files is not the main problem; executing JS in the browser can consume significant device resources, and a script may take several seconds to run, blocking rendering, which is a poor experience for users. Therefore, try to remove JS that takes too long to execute; any script that runs for more than 1-2 seconds deserves careful scrutiny. Also try to merge JS scripts to reduce the number of requests.
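To find the slow scripts worth removing, you can time their synchronous work. The sketch below is a hypothetical helper (the name `timeScript` and the heavy loop are illustrative); `performance.now()` works the same way in the browser as in Node.

```javascript
// Time a script's synchronous work so blocks that run longer than the
// 1-2 second threshold mentioned above can be found and trimmed.
function timeScript(label, fn) {
  const start = performance.now();
  fn();
  const ms = performance.now() - start;
  if (ms > 1000) {
    console.warn(
      `${label} blocked for ${ms.toFixed(0)} ms: consider deferring or removing it`
    );
  }
  return ms;
}

// Example: a deliberately busy loop standing in for a slow widget.
const elapsed = timeScript('fake-widget', () => {
  let x = 0;
  for (let i = 0; i < 1e6; i++) x += i;
});
console.log(`fake-widget took ${elapsed.toFixed(1)} ms`);
```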