Top Website Design Research

Top Website Design Research Points

  • Use F-Shaped Pattern
  • Use Z-Shaped Pattern
  • Don’t make people click more than three times to find their answer
  • Too many options ensure NONE will be chosen
  • Visitors read longer line widths faster, but prefer shorter ones
  • Your headlines draw even more eyes than images!
  • Image captions are the most consistently read in-post content
  • People follow the “line of sight” of other people
  • Don’t Make Users Wait: Speed Up Your Website
  • Make Your Content Easily Readable
  • Don’t Worry About “The Fold” and Vertical Scrolling
  • Place Important Content on the Left of a Web Page
  • Small Details Make a Huge Difference
  • Don’t Rely on Search as a Crutch to Bad Navigation
  • Your Home Page Isn’t As Important as You Think
  • Golden Ratio in Web Design
    • Considering a page with a 900px total width
      • Content area + Sidebar
      • 900px / 1.62 ≈ 555px = content area
      • Sidebar = 900px − 555px = 345px
    • Considering a rectangle block with a 300px width
      • What is the height?
      • 300px / 1.62 ≈ 185px height
  • Mathematics and Design – Golden Ratio + The Rule of Thirds + Grid Systems
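The golden-ratio arithmetic above can be checked with a short sketch (1.62 is the rounded ratio used in these notes; the exact value is about 1.618):

```python
# Golden-ratio layout split, using the rounded ratio from the notes above.
GOLDEN_RATIO = 1.62  # exact value: (1 + 5 ** 0.5) / 2 ≈ 1.618

def golden_split(total_width):
    """Split a total width into content area and sidebar by the golden ratio."""
    content = total_width / GOLDEN_RATIO
    sidebar = total_width - content
    return content, sidebar

content, sidebar = golden_split(900)
print(round(content))  # 556 -> roughly the 555px content area above
print(round(sidebar))  # 344 -> roughly the 345px sidebar above

# Height of a golden rectangle that is 300px wide:
print(round(300 / GOLDEN_RATIO))  # 185
```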


Other Resources

Image Source : Pexels

Click Tracking [Google Analytics]

Event Tracking

On Page Implementation


Custom Fields in bbPress Topic Form

Creating Fields

Saving Custom Fields

Displaying Fields on the Topic Page
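The three steps above (creating, saving, displaying) can be sketched as a small WordPress plugin snippet. The field name, meta key, and the exact hook choices are assumptions for illustration, not the only hooks bbPress offers:

```php
<?php
// 1. Creating the field: render an extra input inside the topic form.
function my_bbp_topic_form_field() {
    $value = bbp_is_topic_edit() ? get_post_meta( bbp_get_topic_id(), 'my_custom_field', true ) : '';
    echo '<p><label for="my_custom_field">My Custom Field</label><br />';
    echo '<input type="text" id="my_custom_field" name="my_custom_field" value="' . esc_attr( $value ) . '" /></p>';
}
add_action( 'bbp_theme_before_topic_form_submit_wrapper', 'my_bbp_topic_form_field' );

// 2. Saving: persist the submitted value as post meta on the new/edited topic.
function my_bbp_save_topic_field( $topic_id ) {
    if ( isset( $_POST['my_custom_field'] ) ) {
        update_post_meta( $topic_id, 'my_custom_field', sanitize_text_field( $_POST['my_custom_field'] ) );
    }
}
add_action( 'bbp_new_topic', 'my_bbp_save_topic_field' );
add_action( 'bbp_edit_topic', 'my_bbp_save_topic_field' );

// 3. Displaying: print the saved value on the single-topic page.
function my_bbp_show_topic_field() {
    $value = get_post_meta( bbp_get_topic_id(), 'my_custom_field', true );
    if ( $value ) {
        echo '<p><strong>My Custom Field:</strong> ' . esc_html( $value ) . '</p>';
    }
}
add_action( 'bbp_template_before_single_topic', 'my_bbp_show_topic_field' );
```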


Permanent 301 Redirect Setup in .htaccess

Redirect old domain to new domain
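For example, in the old domain's .htaccess (olddomain.com and newdomain.com are placeholders):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]
```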

Force Redirect to NON-www version
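A sketch, with example.com as a placeholder domain:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```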

Force Redirect to WWW version
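The mirror image of the rule above (example.com is a placeholder):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```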

Redirect from HTTPS to HTTP
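A sketch (example.com is a placeholder):

```apache
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```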

Redirect from HTTP to HTTPS
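The same rule with the condition inverted (example.com is a placeholder):

```apache
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```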

Redirect files with certain extension
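For illustration, assuming every .html URL should be redirected to its .php equivalent (the extensions are just an example):

```apache
RewriteEngine On
RewriteRule ^(.*)\.html$ /$1.php [R=301,L]
```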

Redirect Individual Files

Considering Two Situations

  1. Redirecting File within the same domain
  2. Redirect File to Another Domain

Redirect File within the Same Domain
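A sketch, with placeholder file names:

```apache
Redirect 301 /old-page.html /new-page.html
```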

Redirect File to Another Domain
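The same directive with an absolute URL as the target (placeholders again):

```apache
Redirect 301 /old-page.html http://newdomain.com/new-page.html
```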

WordPress Robots.txt Sample

WordPress Robots.txt
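A common minimal WordPress robots.txt looks like this (a sketch; adjust the paths to your install):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```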

Adding Sitemaps to WordPress Robots.txt
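Sitemap lines can be appended to the same file (the URLs are placeholders):

```
Sitemap: http://example.com/sitemap.xml
Sitemap: http://example.com/image-sitemap.xml
```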


Allowing all Bots

  • Allowing any Bots to Crawl
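An empty Disallow directive allows everything:

```
User-agent: *
Disallow:
```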

Not Allowing any Bots

  • Not Allowing any Bots to Crawl
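Disallowing the root path blocks the entire site for all bots:

```
User-agent: *
Disallow: /
```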

Block a Folder
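A sketch, with a placeholder folder name:

```
User-agent: *
Disallow: /folder-name/
```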

Block a File
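A sketch, with placeholder names:

```
User-agent: *
Disallow: /folder-name/file-name.html
```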

Block a page and/or a directory named private
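A bare prefix rule (no trailing slash) matches both a page called /private and anything under a /private/ directory:

```
User-agent: *
Disallow: /private
```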

Block All Sub Folders starting with private
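Using the * wildcard (supported by Googlebot and other major crawlers, not by the original robots.txt standard):

```
User-agent: *
Disallow: /private*/
```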

Block URLs that end with a specific string
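The $ character anchors the pattern to the end of the URL (a Googlebot extension); .asp here is only an illustrative suffix:

```
User-agent: *
Disallow: /*.asp$
```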

Block URLs that include a question mark (?)
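A sketch using the * wildcard:

```
User-agent: *
Disallow: /*?*
```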

Block a File Type
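A sketch; .gif is only an illustrative file type:

```
User-agent: *
Disallow: /*.gif$
```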

Block all Paginated pages which don’t have “?” at the end

  • ( Allow )
  • ( Not Allow )

Helps us block paginated pages from being crawled
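One common pattern (a sketch; the wildcard Allow and $ anchor are Googlebot extensions): disallow every URL containing a "?", then re-allow only the URLs that end with "?":

```
User-agent: *
Disallow: /*?
Allow: /*?$
```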

Using Hash
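Lines starting with a hash (#) are comments and are ignored by crawlers:

```
# Block the WordPress admin area for all bots
User-agent: *
Disallow: /wp-admin/
```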

Bots / User Agents

Top 10 Bots

Googlebot Mobile

Individual crawl rules for each bot
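Separate User-agent groups give each bot its own rules. The bot names below are real user-agent tokens; the paths are placeholders:

```
User-agent: Googlebot
Disallow: /no-google/

User-agent: Googlebot-Mobile
Disallow: /desktop-only/

User-agent: Bingbot
Disallow: /no-bing/
```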

Cross Domain Tracking [ Google Analytics ]

Cross Domain Tracking in Google Analytics tracks the same visitor across different domains


Tracking the same visitor navigating across:

  1. Various Domain names
  2. Various Sub Domains
  3. HTTP and HTTPS versions of the same or different domains
  4. 3rd Party Shopping Carts
  5. IFrame


( Example )

Consider two domains :

  1. : Primary Domain
  2. : Secondary Domain

Primary Domain

Secondary Domain

Three or More Domains


SEO Cannibalisation

SEO Cannibalisation, as used here, is the process of consolidating many similar pages, which dilute SEO value, into a single page.

Similar Pages

  • Many Pages with Exactly Similar Page Titles
  • Many Pages with content targeting the same keyword

Reason for SEO Cannibalisation

What happens when you have too many similar pages?

  • It could confuse Google
  • It could dilute the SEO Value
  • It could decrease the overall Conversion Rate
  • It could confuse Website visitors
  • It lacks focus

So we need to perform keyword cannibalisation, i.e. consolidation

SEO Dilution

Issues with having too many similar pages:

  • Internal Linking – Linking to various pages with the same Anchor Text dilutes SEO value
  • Backlinks – Backlinks spread across many similar pages are worth less overall than backlinks pointing to a single page
  • Content Quality – Ideas and research about a single topic divided across many different pages dilutes content quality
  • Conversion Rate – Similar pages convert at different rates, so the overall conversion rate will be lower


Considering similar pages,

Choosing THE Best Page to Cannibalise

A Page with

  • Higher Conversion Rate
  • High Quality Content
  • High Quality Backlinks

Consolidate SEO Value

  • Consolidate Content from other similar pages to The Single Page
  • Update Internal Links & its Anchor Text to The Single Page
  • Update Backlinks from other sites to The Single Page if possible
  • Use 301 Redirect from other pages to The Single Page

Advantages of SEO Cannibalisation

  • Google is no longer confused
  • Solves the issue of SEO Dilution
  • Website visitors are no longer confused
  • Improves the Overall Conversion Rate

It’s Good to FOCUS!

SEO for a Historical Website

SEO (Search Engine Optimisation) for a website with a big history behind it: a website that has been active for a long period of time with regular activity.

Refining the Website Structure

  • Proper Internal Linking

Improving the User Interface of the Website

  • Making it possible for the user to navigate across the historical website and be able to get relevant information
  • Allowing Social Sharing
  • Making it more interactive

Fixing the 404 Error Backlinks

  • 404 is an HTTP status code telling us that a backlink is broken, which wastes Link Juice
  • Using a redirect from the broken link to an appropriate page would add extra value to that page and the website

Off site backlink cleanup

  • Research the website’s old backlinks,
  • And remove unwanted backlinks

Improving the Title and Meta Description

  • The Title contributes most towards increasing Click Through Rate
  • The Meta Description is the second most important contributor to Click Through Rate

Implementing Rich Snippets

It is also called “adding semantics to the website”

  • Making Google understand each part of the website
  • Highlighting to Google the meaning of each section of the website
    • Use Webmaster Tools to highlight this information
    • Or it can be implemented via the backend code
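A minimal sketch of backend-implemented semantics using schema.org JSON-LD (all property values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "datePublished": "2014-01-01",
  "author": { "@type": "Person", "name": "Example Author" }
}
</script>
```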

Other ideas for the Historical Website

  • Setting up an XML sitemap for Google
    • for Normal Pages
    • for Images
    • for News
    • and setting up its Priority and its frequency
    • Finally, submit the XML sitemap to Google via Webmaster Tools
  • Check for duplicate titles, duplicate content, and short meta description issues, and fix them
  • Check if Canonicalisation is necessary
  • Check if Cannibalisation is necessary
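The sitemap bullets above can be sketched as a minimal sitemap.xml with priority and change frequency set per URL (the URLs and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://example.com/about/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```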