Onsite Optimization Techniques Post-Panda Slap 2011

By Guna, DMCA.com - 11/24/2011

Panda Provoked, Re-engineered Onsite SEO Strategy

With the recent Google Panda update in October 2011, we witnessed a huge uproar among webmasters and website owners as many sites with high PageRanks fell like a heap of bricks, while others with authentic organic links and well-optimized pages unexpectedly climbed a few places. Panda has left a lot of sites depleted and starved for traffic, and has left the SEO industry scrambling for answers to mitigate the effects of the Google algorithms. With no known recourse to sort things out and stabilize the fall, webmasters were fumbling in the dark until Google issued its new set of indices to rate and review websites, which made it pretty clear that unless websites were structured and built according to the new guidelines, there was no scope for improvement.

The Google Panda algorithmic update was aimed at removing poor-quality sites from the top of Google's search results, while increasing the PageRank of sites that were structured optimally and had real organic value, better backlinks and original content. Google's new strategy is totally user-oriented, aimed at increasing user loyalty and improving user experience by raising the quality of search engine results, while at the same time bringing down BlackHat-reliant SEO sites that had until then been exploiting Google's vulnerabilities. The Panda update is focused on deterring websites that rank high on search engine results through link farming and other BlackHat approaches rather than serving end users through long-term organic SEO strategies. There are technical as well as non-technical issues that have to be looked into while rebuilding or re-optimizing a site after the Panda update, and in fact they complement each other. Find a result-oriented, WhiteHat, Onsite SEO services provider to optimize your site with the latest SEO strategies.

Some Defining Factors for Being Hit by Google Panda

  • Existence of duplicate, low value and unnecessary URLs
  • Lack of unique title tags for each page
  • Lack of precise meta-descriptions
  • Un-crawl-able site structures
  • Non-existence of HTML and XML sitemaps
  • Duplicate content on pages
  • A large number of pages with little original content
  • A high number of inappropriate adverts, whether or not relevant to the topic of the page
  • Title tags that are irrelevant to the content on the page
  • Unnatural language or overstuffing of keywords on a page
  • High bounce rates and exit rates
  • Boilerplate content
  • Pages high on JS, AJAX, frames, Flash content
  • Low quality inbound links to a page or site and prevalence of unnatural links
  • Very few or no links to a page or site in Social Media and other sites with good PageRank.

On-Page Technical Guidelines to Beat the Panda Slap

1. Create Effective Title Tags

The title tag is one of the crucial guides for crawlers to evaluate a webpage and verify its relevancy, as search results display the webpage title as the linked first line of each query result.

DO: Keep the title tag unique, short, informative and specific to the relevant content on each page.

DO NOT: Stuff keywords, give vague titles or duplicate the same title across different pages.
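
For illustration, a unique, page-specific title tag (the page and site names here are only placeholders) might look like this:

  <head>
    <title>Hand-Made Leather Wallets | Example Store</title>
  </head>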

2. Optimize META Description Tag

The meta description is a brief summary of what the page is all about. Meta descriptions that appear on the search engine results aid searchers in evaluating the content of the page, and can be the deciding factor in whether or not a first-time user visits your site.

DO: Keep it within the specified character limit, with optimal use of keywords, while attracting the searcher with a call to action.

DO NOT: Duplicate descriptions, write expansive descriptions that fail to hit the point, or overshoot with repetitive keywords.
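
As a rough illustration (the wording and page are placeholders), a page-specific meta description sits in the <head> of the page:

  <head>
    <meta name="description" content="Browse our range of hand-made leather wallets. Free shipping on orders over $50 - order online today.">
  </head>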

3. Create better URL structures

A well-made URL with the right keywords and a minimal number of parameters is easy for GoogleBot to crawl, easy for users to remember, and easy to link to. If your URL contains a search term that users might key into search boxes, it becomes all the more crawl-able.

DO:
  • Create simple, search engine friendly URLs that are text based and do not contain numerals or symbols.
  • Punctuate URLs with hyphens.
  • Use keyword-inclusive URLs to aid easy crawl-ability.
  • Use 301 redirects to the canonical URL when the same site is accessible through different URLs.
  • Employ the rel=“canonical” link element if you have multiple URLs for the same content (a rough sketch of both techniques follows these lists).
  • Make clear differentiation between root directory URL and sub-directory URLs.
  • Make use of 301 redirects to direct traffic from non-preferred domains.
  • Use Robots.txt to block access to problematic URLs.
DO NOT:
  • Use long and cryptic URLs that intimidate both users and crawlers or punctuate with underscores. 
  • Use different URL versions (HTML, PHP, ASP) to access the same page.
  • Use upper-case letters in the URL.
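
A rough sketch of the two consolidation techniques mentioned in the lists above, assuming an Apache server with mod_rewrite enabled and using a placeholder domain:

  <!-- rel="canonical" link element placed in the <head> of every duplicate URL -->
  <link rel="canonical" href="http://www.example.com/leather-wallets/">

  # .htaccess: 301 redirect from the non-preferred (non-www) domain to the preferred one
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
  RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]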

4. Design user-friendly Site Navigation Architecture

Define the best and most user-friendly route for users to navigate from a home or root page to the rest of the pages on the website by creating a naturally flowing hierarchy of logical and sensible down-links to specific info-based pages. This enhances both user experience and crawlability. Moreover, a breadcrumb list on each page makes it easier for users to get back to the previous page or to the home page without having to backtrack through all visited pages. As modern technologies like CSS and JavaScript can interfere with a search engine's ability to spider a site, minimize their usage unless there is no other option. That said, the very recent Google algorithm update ensures crawling of JS, Flash, and AJAX.

DO:
  • Prioritize pages based on traffic volume and the website’s USP.
  • Create options for root pages to load when the last portion of the URL is removed.
  • Create linkages to pages with similar or related content.
  • Use text based links to aid easy navigation for users and assist GoogleBot crawling & indexing.
  • Create an HTML site map for users and an XML sitemap for search engine spiders (a minimal XML sitemap sketch follows these lists).
  • Configure the web server to return a 404 HTTP status code, with a link to the root page, when a non-existent page is requested.
  • Indicate the canonical version of each URL in the XML sitemap.
  • Select anchor texts that are self-explanatory.
  • Clearly distinguish anchor text from other text by using a different color.
  • Use anchor text links for both internal as well as external navigation.
  • Store image files in one directory and create an image sitemap to simplify the path to your image files.
DO NOT:
  • Create a random or illogical pattern of linkages that wastes time and deters users from accessing a specific page with ease.
  • Frustrate users by linking each page on your site to every other page.
  • Use images, drop-down menus, animations, Flash or JavaScript to create linkages.
  • Let HTML sitemap become redundant with broken links.
  • Create unorganized HTML sitemap that lists pages without subject-wise sorting.
  • Design a 404 page that is inconsistent with other pages on the site.
  • Use excessive keywords and long phrases as anchor texts.
  • Use multiple sub-directories to store images.
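
The minimal XML sitemap sketch promised above, listing only canonical URLs for a placeholder domain:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
    </url>
    <url>
      <loc>http://www.example.com/leather-wallets/</loc>
    </url>
  </urlset>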

5. Optimize Images, Audio and Video Files for Better Crawling

While most browsers support images in JPEG, GIF, PNG and BMP formats, there are some that might not. Hence it is always worthwhile to use the “alt” attribute, which provides an alternative text description for an image in case it does not show up. Furthermore, the “alt” attribute helps crawlers take note of text that would otherwise be unreadable inside a graphic. When images are used as links to other pages or sites, the alt text of the image is treated much like anchor text by the crawlers (a short markup sketch follows the lists below).
One major factor that can be annoying to web users is having a vast number of video and audio files embedded in your web pages, which increases page loading time and hurts PageSpeed, a determining factor in terms of PageRank.

DO:
  • Use brief, descriptive alt text tags for images instead of generic tags.
  • Use alt text on image links so that search engine spiders can read them and easily follow the outbound links.
  • Match extension of the image file name with the image file type.
  • Create an image sitemap file for GoogleBot to crawl and index effortlessly.
  • Use Flash Video (.flv) compression for your video files.
  • Use (.mp3) compression for your audio files
DO NOT:
  • Use only images to link to other pages and sites.
  • Use image links where anchor texts can be used.
  • Use generic image file names like ‘image1.jpg’
  • Use long file names in the alt text for images.
  • Stuff keywords into alt text or use entire sentences for links.
  • Use (.avi) or (.mpg) for video and (.wav) for audio playback
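
The markup sketch promised above: descriptive alt text on a plain image and on an image used as a link; the file names and URLs are placeholders.

  <!-- Descriptive, concise alt text; the extension matches the actual file type -->
  <img src="/images/brown-leather-wallet.jpg" alt="Brown hand-stitched leather wallet">

  <!-- When an image acts as a link, crawlers treat its alt text much like anchor text -->
  <a href="http://www.example.com/leather-wallets/">
    <img src="/images/wallet-range.jpg" alt="View our leather wallet range">
  </a>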

6. Access Control With Robots.txt, rel=“nofollow”, “NOINDEX”, .htaccess

Some sections or pages of a website may contain personal images, website administration folders, customer test folders, or folders of no search value like cgi-bin, which need not be crawled or indexed by bots or spiders. Placing a robots.txt file in the root directory of a site tells search spiders/bots whether or not to crawl particular pages or parts of the site, so that they can be excluded from indexing; it acts like an electronic no-trespassing sign. With robots.txt, webmasters can prevent indexing totally, prevent access to particular areas of a website, or create individual indexing instructions for specific engines. Webmasters can password-protect an entire site, directory or sub-directory with .htaccess, and it is one of the most secure ways to keep bots from crawling, as bots are not designed to guess passwords. Another technique to block content from appearing in search results is adding "NOINDEX" to your robots meta tag. Google Webmaster Tools comes in handy when there is a need to remove content that has already been crawled.

DO:
  • Create a separate robots.txt file for each sub-domain's content.
  • Use .htaccess or encrypt content as a secure way to block crawler and user intrusion.
  • Add "NOINDEX" to the robots meta tag to prevent search engine visibility (a minimal sketch follows these lists).
  • Use Google Webmaster Tools to remove content that has already been crawled.
DO NOT:
  • Allow URLs created by proxy servers or referral sites to be crawled.
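
A minimal sketch of these access controls; the folder names are placeholders:

  # robots.txt placed in the site root - keep bots out of folders with no search value
  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /admin/
  Disallow: /customer-test/

  <!-- robots meta tag on an individual page that should stay out of the index -->
  <meta name="robots" content="noindex">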

7. Boost PageSpeed for Better PageRanks

There has been a marked increase in the frequency of changes to Google algorithms over the past year or so, and almost all of them have been aimed at providing users with the most relevant results and a great user experience. PageSpeed, or page loading speed, is now a part of Google's ranking algorithm and has a definitive say on your PageRank and SERP position. Sites with quick loading times are sure to increase user satisfaction and improve the overall quality of the web, particularly for users with slow Internet connections. Graphics are important to web pages, but they also take time to download and can be a major hindrance to improving PageSpeed.

If you have page load issues and are looking to rectify them, make sure that your website's server has low ping values and is not over-loaded; especially if you are serving dynamic files like ASP, ASP.NET, PHP or Ruby, server over-load can kill performance. Also consider switching to a Virtual Private Server with guaranteed RAM and CPU, or, if your website is already popular, a dedicated server will perform best. Some of the major speed boosters are optimizing HTTP requests, combining and compressing external files, loading JavaScript asynchronously, using cookie-less domains, etc.

DO:
  • Reduce the number of HTTP requests by minimizing design-enhancing CSS files, JavaScript library references and images, all of which affect PageSpeed.
  • Minimize or avoid use of plug-ins as the extra page-load time may not be worth it.
  • Resize and optimize JPEG, GIF, PNG & BMP image files as they contain loads of irrelevant metadata which can drastically increase page loading time.
  • Compress images before uploading them on image-intensive pages.
  • Compress files at the server level before sending them to browsers (a compression sketch follows these lists).
  • Remove whitespace before serving your code.
  • Use Content Delivery Networks to turbo-charge the speed of your site.
  • Include all interface-related style sheet references in the <head> of your document.
DO NOT:
  • Upload and serve images larger than what the design requires, or upload large images directly off digital cameras without compressing them.
  • Display un-styled content to visitors.
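
As a sketch of the server-level compression mentioned in the list above, assuming an Apache server with mod_deflate available:

  <IfModule mod_deflate.c>
    # Compress text-based assets before they are sent to the browser
    AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
  </IfModule>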

8. Design & Page Layout

With the recent Google algorithm update, all JavaScript, AJAX and Flash files can be crawled by GoogleBot, which is certainly a boost to sites that are rich in graphic data. Page layouts and designs based on these platforms had been a deterrent to better crawling by GoogleBot, and that now stands rectified.

The latest Google algorithm has also come down heavily on ad-cluttered and ad-heavy pages, which we discussed in our previous article. Ad-heavy pages can result in loss of PageRank. Limit advertisements to no more than two and move them to the sidebar, header or footer so that they do not figure within the text.

A web server that supports the If-Modified-Since HTTP header can inform Google whether or not the website content has been altered since GoogleBot last crawled it. Supporting this feature saves you bandwidth and overhead. Take advantage of some of the best Onsite SEO service packages that make use of the latest SEO strategies in deriving better PageRanks.
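
As a closing illustration, the conditional request exchange enabled by the If-Modified-Since header described above looks roughly like this (URL and date are placeholders):

  GET /leather-wallets/ HTTP/1.1
  Host: www.example.com
  If-Modified-Since: Mon, 21 Nov 2011 08:30:00 GMT

  HTTP/1.1 304 Not Modified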

Category: SEO News

Tags: On site Optimization, On site Optimization Techniques, Panda Slap 2011, Onsite SEO Strategy, OnPage Technical guidelines, SEO News, Google, Page Speed Service

About Guna

Guna is a Web Solutions Architect and Consultant focused on developing and implementing turnkey SEO, SEM, Social Media Marketing, and Web ...