Ranking criteria for a website

To rank sites, search engines take the following factors into account:

  • Domain registration time (domain age)
  • Content age
  • Content frequency: how often new content is added
  • Text size: number of words above 200-250 (did not affect Google in 2005)
  • Link age and reputation of the pointing site
  • Standard page features
  • Negative score for on-page characteristics (e.g., a penalty for websites with extensive use of keyword meta tags, indicating artificial optimization)
  • Content originality
  • Relevant terms used in the content (the terms that search engines associate as being related to the main theme of the page)
  • Google PageRank (used only in Google's algorithm)
  • Number of external links
  • Anchor text contained in external links
  • Research citations and sources (indicates that the content is research quality)
  • Related terms in the search engine's database (e.g., finance/financing)
  • Negative score for inbound links (probably coming from low value pages, reciprocal inbound links, etc.)
  • Inbound link acquisition rate: too many links gained too quickly may indicate commercial link-buying activity
  • Text surrounding outbound and inbound links: a link accompanied by the words “sponsored links” may be ignored
  • Use of the “rel=nofollow” attribute to sculpt how ranking value flows within the website
  • Document depth on the website
  • Metrics collected from other sources, such as monitoring how often users click back to the results page after a SERP sends them to a particular page (bounce rate)
  • Metrics collected from sources such as the Google Toolbar, Google AdWords/AdSense, etc.
  • Metrics collected from data shared with third parties (such as providers of website traffic monitoring tools)
  • Rate of removal of links pointing to the site
  • Sub-domain usage, keyword usage in sub-domains, and volume of content in sub-domains, with a negative score for this activity
  • Semantic connections of served documents
  • IP of the hosting service and the number/quality of other sites hosted there
  • Using 301 (permanent) redirects instead of 302 (temporary) redirects
  • Returning a 404 status code instead of 200 for pages that don’t exist
  • Proper use of robots.txt file
  • “Broken” links
  • Unsafe or illegal content
  • HTML coding quality, presence of errors in the code
  • Actual click-through rate seen by search engine for listings displayed on SERPs
  • Human-curated importance rankings of the most frequently accessed pages (e.g., the Open Directory Project, ODP)
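Several of the factors above are mechanically checkable on a page. As a minimal sketch of one of them, the snippet below audits outbound links for the “rel=nofollow” signal using Python’s standard `html.parser`; the class name, sample HTML, and output lists are illustrative, not part of any search engine’s actual implementation.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collects anchor hrefs, separating links marked rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.followed = []   # links that can pass ranking value
        self.nofollow = []   # links search engines may ignore

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)          # attrs arrives as (name, value) pairs
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.followed).append(href)

# Hypothetical page fragment: one editorial link, one sponsored link.
html = """
<p><a href="https://example.com/a">editorial link</a></p>
<p><a rel="nofollow" href="https://example.com/ad">sponsored link</a></p>
"""

auditor = LinkAuditor()
auditor.feed(html)
print(auditor.followed)   # editorial links
print(auditor.nofollow)   # links flagged with rel=nofollow
```

The same parsing approach extends to other on-page checks in the list, such as counting broken links or detecting “sponsored links” text adjacent to an anchor.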