Back to the Basics
Before we delve into the details of optimizing your website so it gets indexed properly, let’s brush up on some basic terminology.
- HTML – Hypertext Markup Language acts as a content organizer, giving a website its structure: bullet lists, headlines, paragraphs, and so forth.
- SEO – Search Engine Optimization covers everything from a well-structured website to relevant keywords, correct tagging, technical standards, and, of course, great content.
There is plenty of overlap here, of course, but these fundamentals directly impact your site’s crawlability and searchability.
Understanding SEO: Crawling, Indexing, and Ranking
Crawling is the process by which Google discovers your site. Crawlers start from known web pages and then follow the links on those pages. You can help Google by telling the crawler which pages to crawl and which to skip via a “robots.txt” file.
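As an illustration, a minimal robots.txt (the directory, domain, and sitemap path here are made up for the example) might allow most of the site while keeping crawlers out of a low-value section:

```
User-agent: *
Disallow: /internal-search/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of the domain; note that `Disallow` only discourages crawling and is not a reliable way to keep a URL out of the index.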
“Client-side analytics may not provide a complete or accurate representation of Googlebot and WRS activity on your site. Use Search Console to monitor Googlebot and WRS activity and feedback on your site.” – Google Developers
The crawler sends what it finds to the indexer and prioritizes URLs based on their value. Once this stage is complete and no errors appear in Search Console, the ranking process can begin. This is also when SEO specialists need to step in, offering quality content and upgrading the site to attract more valuable links.
- functions that sit outside the href attribute-value pair (AVP) but within the tag
- functions that sit outside the “a” element but are named inside the href AVP
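The distinction above matters because Googlebot reliably follows only links in `<a>` elements with a resolvable URL in the href attribute. A sketch of the three patterns (the link target and function name are illustrative):

```html
<!-- Followed: a real URL inside the href AVP -->
<a href="/products">Products</a>

<!-- Generally not followed: function outside the href AVP but within the tag -->
<a onclick="goTo('products')">Products</a>

<!-- Generally not followed: function named inside the href AVP -->
<a href="javascript:goTo('products')">Products</a>
```

If navigation must run JavaScript, keeping a crawlable URL in href alongside the handler preserves discoverability.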
The exciting thing is that Google can handle JS fairly well. Still, Googlebot has a fixed time budget for rendering JS, which you shouldn’t take for granted.
- Server-side rendering – As the name implies, the code is rendered on the server before Google crawls and indexes it. The biggest advantage is that client-side content can be served without requiring a second rendering pass.
- Hybrid rendering – This approach balances client- and server-side rendering to improve the user experience. Use server-side rendering for critical page sections, such as the initial page layout and primary content, and client-side rendering for non-critical components.
- Dynamic rendering – Following a recent change in Google policy, a page can be rendered server-side for Googlebot while end-users receive the client-side version. Many web developers call this cloaking, but the search engine has revised its guidelines to allow it in this situation. If you have a large site that changes frequently, where content could be stale by the time Google takes its second pass to render it, dynamic rendering may be the only practical choice.
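The core of a dynamic-rendering setup is deciding, per request, whether the visitor is a known crawler. A minimal sketch of that decision in Python; the bot tokens, function names, and return labels are illustrative, not an official API, and real deployments use maintained crawler lists:

```python
# Dynamic rendering sketch: route known crawlers to a pre-rendered
# HTML snapshot and everyone else to the client-side app.

CRAWLER_TOKENS = ("googlebot", "bingbot", "duckduckbot")  # illustrative list


def is_crawler(user_agent: str) -> bool:
    """Rough user-agent sniffing; production setups use curated bot lists."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)


def choose_renderer(user_agent: str) -> str:
    """Return which rendering path to serve for this request."""
    return "server-rendered-html" if is_crawler(user_agent) else "client-side-app"


print(choose_renderer("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # server-rendered-html
print(choose_renderer("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))  # client-side-app
```

Because only the user agent is inspected and both variants carry the same content, this stays within Google’s allowance rather than crossing into cloaking.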