Technical SEO Tips

  • Choose A Website Hosting Company That Is SEO Friendly. The hosting platform you choose will affect website performance, i.e. the speed and experience of your customers. Ensure your host offers:
    • High uptime: this ensures your website is always live on the internet.
    • A good server location: this determines how quickly users can access your website from any location around the world; you can also make your website fast worldwide by using a CDN (Content Delivery Network).
    • Security: choose a hosting plan that offers strong security (including SSL) and will back up your website in case it is hacked or has issues.
  • Use A Friendly URL Structure. Your web page URLs are among the first things search engine bots see (while crawling) and your users see (in the search results).
    • Include your primary (target) keywords in the URL.
    • Create a website URL structure that makes sense for your customers and helps search engine bots understand which web pages are important.
    • Avoid dynamic URL strings, or ask Google to disregard certain URL parameters in Google Search Console (Configuration > URL Parameters) or block them in your robots.txt file.
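For example, here is a minimal Python sketch (standard library only, with a hypothetical list of crawled URLs) that flags parameterized URLs so you can decide which ones to exclude:

```python
# Flag crawled URLs that carry query parameters so you can decide which ones
# to exclude via robots.txt or Search Console's URL Parameters settings.
# The list below stands in for a real crawl export.
from urllib.parse import urlparse, parse_qs

crawled_urls = [
    "https://www.example.com/shoes/",
    "https://www.example.com/shoes/?sort=price&color=red",
    "https://www.example.com/search?q=boots",
]

for url in crawled_urls:
    params = parse_qs(urlparse(url).query)
    if params:
        print(f"Parameterized URL: {url} -> parameters: {sorted(params)}")
```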
  • Choose One Preferred Version Of Your Website. Usually there are two versions of your website: www and non-www.
    • Choose ONE preferred domain name version for your website and use 301 redirects for the rest.
  • Create An XML Sitemap: This helps search engine bots find your web pages easily (and sometimes more quickly). Search engine bots also use your XML sitemap to determine which web pages are important.
    • Only include important web pages in the sitemap. An XML sitemap is a list of web pages you recommend to be crawled.
    • Check the last modified tag in your XML sitemap: Google has acknowledged that it uses the lastmod metadata to understand when a web page last changed and whether it should be re-crawled. The last modified tag also signals to search engine bots how often you create new content (freshness). Google ignores the priority and change frequency metadata.
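As an illustration, here is a rough Python sketch (standard library only, with placeholder pages and dates) that builds a minimal sitemap containing only important pages and a lastmod value for each:

```python
# Build a minimal XML sitemap with a <lastmod> date for each important page.
# The page list and dates are placeholders.
from xml.etree.ElementTree import Element, SubElement, ElementTree

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [
    ("https://www.example.com/", "2024-01-15"),
    ("https://www.example.com/services/", "2024-01-10"),
]

urlset = Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = loc
    SubElement(url, "lastmod").text = lastmod  # helps Google decide whether to re-crawl

ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```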
  • Find And Fix Bad Redirects
    • Single URL format: In addition to making sure HTTP always redirects to HTTPS, ensure that either the www or the non-www URL version is used exclusively and that the alternative always redirects to it. Check this for both HTTP and HTTPS, and make sure all internal links use the preferred format so they do not redirect.
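One way to verify this is a short Python sketch like the one below (it assumes the preferred version is https://www.example.com/ and uses the third-party requests library):

```python
# Check that every non-preferred homepage variant 301-redirects to the one
# preferred URL. Adjust PREFERRED and the variants to match your own site.
import requests  # third-party: pip install requests

PREFERRED = "https://www.example.com/"
variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in resp.history]
    ok = resp.url == PREFERRED and all(code == 301 for code in hops)
    print(f"{'OK' if ok else 'CHECK'}: {url} -> {resp.url} via {hops or 'no redirect'}")
```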
  • Check The Robots.Txt File For Any Errors. Robots.txt is a text file used to instruct search engine bots (also known as crawlers, robots, or spiders) how to crawl and index web pages. Any errors will cause indexing mistakes.
    • It is best practice to disallow comment URLs, e.g. Disallow: /*comments=all
    • Include links to your preferred sitemap(s) e.g. Sitemap: https://www.google.com/sitemap.xml
    • Disallow pages with thin or irrelevant content that you do not want ranking. Types of pages to disallow (exclude) include: pages with duplicate content, admin pages, shopping cart pages, thank-you pages, dynamic product pages, internal search results pages, etc.
    • Use comments in the robots.txt file to help other people understand the directives it contains, e.g. #we have included this page because….
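To confirm your directives behave as intended, you can test a few URLs against the live file, for example with Python’s built-in robots.txt parser (the URLs and expectations below are hypothetical):

```python
# Test a handful of URLs against the live robots.txt to confirm the
# allow/disallow rules do what you expect (Python 3.8+ for site_maps()).
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

checks = {
    "https://www.example.com/services/": True,       # should stay crawlable
    "https://www.example.com/cart/": False,          # shopping cart: disallow
    "https://www.example.com/?comments=all": False,  # comment URLs: disallow
}

for url, expected in checks.items():
    allowed = rp.can_fetch("Googlebot", url)
    flag = "OK" if allowed == expected else "MISMATCH"
    print(f"{flag}: {url} (allowed={allowed}, expected={expected})")

print("Sitemaps declared:", rp.site_maps())
```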
  • Identify (And Fix) Crawl Errors
  • Find Out How Google Views Your Page Using Google Search Console > Fetch As Google (Free)
    • If Google can’t fully access your page, it won’t rank.
  • Check Mobile-Friendly With Google’s Mobile-friendly Test (Free)
    • Google recently launched a new “Mobile-First Index”.
    • This means: If your site isn’t mobile-optimized, it’s not going to rank very well. Most searches take place on mobile devices as opposed to desktop. So having a mobile-friendly website is more important than ever.
  • Scan Your Website For Broken Links With Drlinkcheck.com (Free). Broken links can really hurt your SEO and are bad for user experience.
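If you prefer to script the check yourself, a basic broken-link pass might look like this Python sketch (the link list is a placeholder and it uses the third-party requests library):

```python
# Check a list of internal links for broken targets (4xx/5xx responses).
# In practice you would feed this list from a crawl export.
import requests  # third-party: pip install requests

links = [
    "https://www.example.com/",
    "https://www.example.com/old-page/",
]

for link in links:
    try:
        resp = requests.head(link, allow_redirects=True, timeout=10)
        if resp.status_code >= 400:
            print(f"BROKEN ({resp.status_code}): {link}")
    except requests.RequestException as exc:
        print(f"ERROR: {link} ({exc})")
```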
  • Optimize Your Website For Crawl Budget. Google has a crawl rate limit for every website. For large websites, you need an optimal structure to make it easy for crawlers to crawl and index your web pages.
    • If you use Javascript, read about Javascript SEO in the resources and ensure Javascript code is free of errors. Javascript errors will use up your crawl budget resulting in some web pages not being indexed.
    • Avoid duplicate content. For large websites, duplicate content confuses the search engine bots as they do not know which is the preferred web page to rank in the search results.
    • Check The Index Size. Does the index size reported in Google Search Console (GSC) match the ‘real’ number of your web pages?
    • Check How Long It Takes For Google To Index Your Updated Page Titles And Meta Descriptions. If it is taking too long, then you have technical SEO issues – something is using up your crawl budget and affecting the crawl and indexing frequency.
    • Look Out For Crawler Traps. Crawler traps are website structural issues that cause search engine crawlers to find irrelevant URLs, resulting in index bloat. Crawler traps will hurt your crawl budget (i.e. search engine bots spend your crawl budget on useless web page URLs instead of your important web pages) and cause duplicate content. Common crawler traps include: URLs with query parameters, infinite redirect loops, links to internal searches, dynamically generated content, infinite calendar pages and faulty links.
      • Avoid crawler traps by blocking the offending URL patterns in your robots.txt file, adding a noindex robots meta tag to the affected pages, or using the nofollow HTML attribute on links that point to them.
  • Improve Website Speed
    • Reduce HTTP Calls. For WordPress websites, reduce the number of plugins you use and choose SEO-friendly WordPress plugins.
    • Enable Browser Caching. With browser caching enabled, the elements of your web page are stored in your visitors’ web browser so the next time they visit your site, or when they visit another web page, their browser can load the page without having to send another HTTP request to the server for any of the cached elements.
    • Enable File Compression. This reduces the amount of time it takes to download HTML, CSS and Javascript files.
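A quick way to spot-check both caching and compression is to look at the response headers of a few assets, roughly as in this Python sketch (placeholder asset URLs, third-party requests library):

```python
# Confirm that key assets are served with browser-caching and compression
# headers. The asset URLs are placeholders.
import requests  # third-party: pip install requests

assets = [
    "https://www.example.com/css/main.css",
    "https://www.example.com/js/app.js",
]

for url in assets:
    resp = requests.get(url, headers={"Accept-Encoding": "gzip, br"}, timeout=10)
    caching = resp.headers.get("Cache-Control", "MISSING")
    encoding = resp.headers.get("Content-Encoding", "not compressed")
    print(f"{url}\n  Cache-Control: {caching}\n  Content-Encoding: {encoding}")
```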
  • Secure Your Site With HTTPS. This is a confirmed Google Ranking Signal. HTTPS will protect your visitors’ data. This is especially important if you have any contact forms on your site. If you’re asking for passwords or payment information, then it’s not just important, it’s an absolute must.
  • ID And Fix Duplicate Meta Tags: In Google Search Console, go to “Search Appearance” > “HTML Improvements” (Free)
  • Install And Use Breadcrumbs Navigation Across Your Website
  • Fix Duplicate Content Issues With Canonical Tags
    • Duplicate content occurs when you have two or more similar or identical pages on your website. This can cause problems as Google may not know which of the pages, if any, should rank. Fix these by canonicalizing the affected pages to a “master” page.
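A rough Python sketch of that check (hypothetical URLs, using the third-party requests and beautifulsoup4 libraries) could look like this:

```python
# Pull the rel="canonical" URL from each duplicate page and confirm it points
# at the intended "master" page.
import requests                # third-party: pip install requests
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

MASTER = "https://www.example.com/red-shoes/"
duplicates = [
    "https://www.example.com/red-shoes/?sort=price",
    "https://www.example.com/sale/red-shoes/",
]

for url in duplicates:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else None
    flag = "OK" if canonical == MASTER else "CHECK"
    print(f"{flag}: {url} -> canonical: {canonical}")
```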
  • Monitor Site Uptime
    • Use a free uptime monitoring tool such as Pingdom or UptimeRobot to verify that your site’s uptime is reasonable. In general, you should aim for uptime of 99.999 percent. Dropping to 99.9 percent is sketchy, and falling to 99 percent is completely unacceptable. Look for web host uptime guarantees, check how they will compensate you when those guarantees are broken, and hold them to their word with monitoring tools.
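If you want an extra check of your own, a very small probe run from cron can complement those services; here is a minimal Python sketch (placeholder URL, third-party requests library):

```python
# Minimal uptime probe: request the homepage, record status and response time.
# Run it every few minutes from cron and alert on repeated failures.
import datetime
import requests  # third-party: pip install requests

SITE = "https://www.example.com/"

try:
    resp = requests.get(SITE, timeout=10)
    up = resp.status_code < 500
    note = f"status {resp.status_code}, {resp.elapsed.total_seconds():.2f}s"
except requests.RequestException as exc:
    up, note = False, str(exc)

print(f"{datetime.datetime.now().isoformat()} {'UP' if up else 'DOWN'} {SITE} ({note})")
```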
  • Scan Your Website For Malware:
    • Google and most web browsers now display warnings such as “This site may be hacked” to searchers when a website appears to be hacked or unsafe.
  • Check For Server Errors And Redirects.
    • Crawl your site with a tool such as Screaming Frog. You should not find any 301 or 302 redirects, because if you do, it means you are linking to URLs that redirect. Update any links that redirect. Prioritize removing links to any 404 or 5xx pages, since those pages don’t exist or are broken. Block 403 (forbidden) pages with robots.txt.
  • Check For Noindex And Nofollow
    • Once your site is public, use a crawler to verify that no pages are unintentionally noindexed and that none of your pages or internal links are nofollowed. The noindex tag tells the search engines not to put the page in the search index, which should only be done for duplicate content and content you don’t want to show up in search results. The nofollow tag tells the search engines not to pass PageRank from the page, which you should never do to your own content.
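A simple version of that verification in Python might look like this (placeholder page list, third-party requests and beautifulsoup4 libraries):

```python
# Flag pages whose meta robots tag contains an unintentional noindex or nofollow.
# The page list stands in for a full crawl.
import requests                # third-party: pip install requests
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

pages = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    content = (meta.get("content", "") if meta else "").lower()
    if "noindex" in content or "nofollow" in content:
        print(f"CHECK: {url} -> meta robots: {content}")
```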
  • RSS Feeds
    • While rich site summary (RSS) feeds are no longer widely used by the general population, they are still read by crawlers, which can pick up additional links from them, so they are useful primarily for indexing. Include a rel=alternate link to indicate your RSS feed in the source code, and verify that your RSS feed functions properly with a reader.
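Beyond opening it in a reader, you can also confirm the feed parses and lists items with a few lines of Python (the feed URL is a placeholder; uses the third-party requests library plus the standard XML parser):

```python
# Fetch the RSS feed and confirm it parses and contains items with links.
import requests  # third-party: pip install requests
import xml.etree.ElementTree as ET

FEED = "https://www.example.com/feed.xml"

root = ET.fromstring(requests.get(FEED, timeout=10).content)
items = root.findall("./channel/item")
print(f"{FEED}: {len(items)} item(s) found")
for item in items[:5]:
    print(" -", item.findtext("link"))
```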
  • Rich Snippets: 
    • If you are using semantic markup, verify that the rich snippets are showing properly and that they are not broken. If they are broken or missing, validate your markup to ensure there are no errors. It is possible that Google simply won’t show the rich snippets anyway, but if they are missing, it is important to verify that errors aren’t responsible.
  • Make Your Website’s Structure Clear, Intuitive And Up-to-date. The way you organize your site architecture and navigation is crucial for both SEO and your visitors. Search engines go through the link structure to find and index pages. If your site is structured well, all the pages and subpages will be easily found and indexed by search engine crawlers. Intuitive navigation also works for your visitors, helping them find what they came for in the least amount of time possible. The ‘three clicks’ rule says that any information on a website should be available to a user within no more than three clicks, and this is how you should plan it.
  • Monitor Your Website Performance. Page speed depends on many factors, with the way the site is built, how it is optimized for performance, and the hosting being key among them. Testing for speed should therefore be done on a continuous basis to ensure your site is always available and always responding. Google research details that for every additional second it takes a page to load, conversions can fall by 20%, so ensure your site is blisteringly fast 24 hours a day, 7 days a week.
    • Review Site Speed In Google Analytics: Behaviour > Site Speed
    • Monitor Messages From Search Console Regarding Uptime And Availability
    • Do A Monthly SEO Health Check: Even without signing up for a third-party tool, you can still do a basic SEO check with Search Console. Using it you can check for large issues such as a drop in traffic, or see whether your site has been penalized by Google in some way. Search Console will email you about major issues, but proactive checks will keep you ahead. There’s a wide range of improvements that you can make to your SEO using Search Console information. From checking your robots.txt files and XML sitemaps to monitoring for new broken links, a regular check (as little as 5 minutes) can help keep your site in tip-top shape.
  • Optimize Website For Mobile-friendliness By Utilizing Responsive Design.
  • Fix Any Duplicate Content Issues:
    • One common SEO issue that can be tackled is duplicate content. This is when there is more than one way to reach the same content on your site. 
    • Often a tidy-up of your settings or the use of what are called *canonical* tags can help remedy the situation.
  • Fetch & Render. Within Google Search Console exists a tool that allows you to fetch and render your pages. This allows you to see how the search engine perceives your pages.
  • Remove anything that slows down your site. Page load speed is a ranking factor in SEO. Page load times are important, so get rid of any non-essentials that bog down your website. These may include music players, large images, flash graphics, and unnecessary plugins.
  • Place Your Javascript At The Bottom To Avoid Users Seeing A Blank Page While Javascript Loads.
  • Install And Set Up Google Analytics And Google Search Console (SEO Tools) To Measure SEO Efforts, The Behavior Of Users Once On Site And Whether They Are Converting (To See Crawl Stats, Submit A Sitemap, See Broken Links)
    • Register with Bing Webmaster Tools and with Google Search Console.
    • Google Search Console (previously Webmaster Tools) is a free service that helps you monitor the health of your website, from how well the search engine can access and crawl your site to how many pages are indexed. Search Console will also make recommendations for basic improvements to the HTML of your website.
  • Use The Right Keywords In Your Images
  • Check The Number Of Indexed Web Pages.
    • Screaming Frog will give you an idea of the number of HTML pages on your site. You can then double-check how many pages you have indexed in Google Search Console.
    • Make sure your website is indexed in search engines. A lot of search engines will automatically find and index your content, but don’t count on it. You want to be sure engines like Google, Bing, and Yahoo are crawling your site so that people are finding you online. (You can submit your site to them directly if it isn’t.)
  • Manually Crawl Your Website To Check URLs, Page Titles, Meta Descriptions And Header Tags
  • Do Regular Backups Of Your Website.
  • Implement The Speakable schema.org Property. Identify sections within an article or webpage that are best suited for audio playback using text-to-speech (TTS). Adding speakable schema.org markup allows search engines and other applications to identify content to read aloud on Google Assistant-enabled devices using TTS. Webpages with speakable structured data can use the Google Assistant to distribute the content through new channels and reach a wider base of users.
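As a rough illustration, the JSON-LD for a speakable section can be generated and printed for embedding like this (the page name, URL and CSS selectors are hypothetical):

```python
# Build the JSON-LD for the speakable schema.org property and print the
# <script> block to paste into the page's <head>.
import json

speakable_markup = {
    "@context": "https://schema.org/",
    "@type": "WebPage",
    "name": "Technical SEO Tips",
    "url": "https://www.example.com/technical-seo-tips/",
    "speakable": {
        "@type": "SpeakableSpecification",
        "cssSelector": ["headline", "summary"],  # sections best suited to text-to-speech
    },
}

print('<script type="application/ld+json">')
print(json.dumps(speakable_markup, indent=2))
print("</script>")
```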
  • Do Not Include Or Use Front-end Javascript Libraries With Known Security Vulnerabilities
  • Improve your first contentful paint by minimizing render-blocking resources, using HTTP caching, minifying and compressing text-based assets, doing less JavaScript work and optimizing the critical rendering path.
  • Optimize The Critical Rendering Path: The CRP is the sequence of steps the browser goes through to convert the HTML, CSS, and JavaScript into pixels on the screen. The quicker the browser can do this, the quicker your website renders. There are three places where you can optimize the critical rendering path and thus the speed with which the browser produces a visible result for the user:
    • Minimize the Bytes that Go Down the Network
    • Minimize Render Blocking CSS
    • Minimize Parser Blocking JavaScript
  • Analyze Your Website Log Files. Log file analysis helps you understand how Google and other search engines view your website during crawling. By browsing your website, crawling bots (such as Googlebot) leave information in your server’s log files. This information helps you identify crawling issues, detect and correct broken links, check and validate redirections, monitor the speed of web pages, find which web pages are crawled the most, etc.
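For a very basic start, you can count Googlebot hits per URL in a standard combined-format access log with a short Python sketch like this (the log path is a placeholder):

```python
# Count Googlebot requests per URL in a combined-format access log to see
# which pages are crawled the most. Adjust the path to your own log file.
from collections import Counter

hits = Counter()
with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        parts = line.split('"')
        if len(parts) > 1:
            request = parts[1].split()  # e.g. ['GET', '/page/', 'HTTP/1.1']
            if len(request) >= 2:
                hits[request[1]] += 1

for path, count in hits.most_common(10):
    print(f"{count:5d}  {path}")
```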
  • Implement Javascript SEO. Many websites use JavaScript frameworks and libraries like Angular, React, Vue.js and Polymer that present many issues for search engines when it comes to crawling.
    • Read the Ultimate Guide To Javascript SEO by Tomasz Rudzki
    • Focus on crawlability, renderability and crawl budget. Javascript errors are known to impact all 3 with implications for your rankings in the search results.
    • Check Your Javascript For Errors. Javascript errors are less forgiving than HTML errors.
