Every online business needs to consult an SEO at some point, and as a Webmaster, understanding the basics of SEO always helps. So let me share a complete list of basic SEO tips that every Webmaster should know.
- Links And Anchor Text Matters
- Use of Robots.txt
- How to Contact Webmasters
- Keyword Research
- Check for Duplicate Content
- Firebug
- Use Traffic Analysis Tools Efficiently
- Google Operators
- Browser plugins to show nofollow links
- Understand Meta Tags
- Google Authorship
- Google +1 Integration
- Google Webmaster Tools
- Bing Webmaster Tools
- Submit XML Sitemaps to Search Engines
- W3C Validator
- Google Alerts
- Canonical URLs
- Analyze Server Error Logs
- Analyze site Performance
- Mobile SEO
- Check Server Header
- WayBack Machine
- FTC Guidelines
- WordPress
1. Links And Anchor Text Matters
Link building has been the backbone of SEO for years, and in 2013 it is still the biggest part of SEO.
A June 2013 Moz survey of 128 SEO professionals indicates that link building makes up approximately 40% of all SEO factors.
Link building has evolved a lot over time. Maybe 5 to 7 years ago you could easily outrank other sites with link exchanges, and a couple of years back you could outrank sites with links from blog networks. Today you can write guest posts or press releases for links, but a recent August update devalued links from press release sites, and in a few months even guest posts done purely for the sake of links may go the same way.
These steps by Google do not make link building less important; they make links even more important. Google is not decreasing the value of links in its algorithm but is tuning the algorithm so that links built only for the sake of links carry less weight. Finding ways to build links that look natural, rather than like the output of a link building process, is the future of link building.
2. Use of Robots.txt
It's robots.txt and not robot.txt, which is where many webmasters goof up. Robots.txt was used to block search engine bots from accessing part (or the whole) of a website. Because it is normally used for blocking search engines, very few SEO experts make full use of it, thinking the more pages Google indexes from the domain the better, but that is completely false.
If Google finds tons of pages on your site with very thin content (or links to pages available only to members), it may not value your valuable content as much as it should, assuming the site has more thin content than solid content.
Ideally, robots.txt should block pages of the site that will not bring search engine traffic. Some example pages that should be blocked are:
- Images directories
- Site admin Area or Moderator Area
- Site Search
- Member profile Pages
- User Control Panel Pages
The above list varies depending on the type of site. As an example, a social networking site may not want to block member profile pages because it may be targeting members' names as keywords in search engines, and an image-hosting site may not block the site search pages because those are content pages for that kind of site.
In the opening lines I wrote "robots.txt was used" and not "robots.txt is used", because nowadays robots.txt does more than just block search engines: you can also specify your XML Sitemap files in your robots.txt file.
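As a sketch, a robots.txt along these lines blocks the kinds of pages listed above and also points bots at the sitemap (the paths and the sitemap URL here are placeholders; use your site's real ones):

```
User-agent: *
Disallow: /images/
Disallow: /admin/
Disallow: /search/
Disallow: /members/
Disallow: /usercp/

Sitemap: http://www.example.com/sitemap.xml
```

The file must live at the root of the domain (e.g. example.com/robots.txt) for bots to find it.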
3. How to Contact Webmasters
Every real website has an option to get in touch with the Webmaster or the site owner, be it Mark Zuckerberg for Facebook or Larry Page for Google or any other website. The point is to understand whom you should be contacting and through what medium. If you only use the whois email to get in touch with the Webmaster, you may actually land in the wrong person's inbox, and your email just gets deleted.
I often get a lot of SEO proposals in my inbox that look pretty similar to this one.
There is no way that this email is genuine, because my contact page on Go4Expert (or any other page on my site) does not display that email publicly. Either they used one of the forum emails to grab that address, or they pulled it from the whois information. In either case I can safely delete that email or report it as spam.
4. Keyword Research
Keyword research is the fundamental building block of SEO (as well as PPC marketing), and if you don't use the right keywords that people are actually searching for, you may be wasting a lot of your SEO effort. Check out the complete guide to keyword research.
Keyword research with the wrong tools can lead to drastically bad results for SEO. If you are using the Google AdWords Keyword Tool for SEO, check out Why I don't Recommend Google's Keyword Tool for SEO.
5. Check for Duplicate Content
Copyscape is a free tool that allows you to check any site for duplicate content. The free version allows only 5 scans daily per domain and you can only check pages already online, but if you are outsourcing your content or have writers write content for you, it makes complete sense to use their pro service to check the content before making it live on your website, to avoid being penalized for duplicate content.
There are other such duplicate content detection services but none of them are as good as Copyscape.
6. Firebug
Firebug is a Firefox plugin (also available for other browsers in the form of Firebug Lite) that helps you inspect and modify HTML, CSS and site layout in real time.
You may be wondering how Firebug can help with SEO.
SEO is Web Development and Web Development is SEO, and every Webmaster should understand the basics of Web Development, like a site's HTML structure, CSS properties, etc. As an example, if you are getting a new look for your site or getting a brand new site developed, you should be able to verify that your developer is writing the right kind of HTML layout for your web pages.
Firebug is the best plugin that any Web Developer can dream of, and it can also be extended with features like Page Speed and YSlow to analyze page loading time in real time, which helps further with SEO.
7. Use Traffic Analysis Tools Efficiently
Every Webmaster should be able to use Google Analytics as well as other Analytics tools like StatCounter, GetClicky, KissMetrics, Crazy Egg to track the right information for their website and business as a whole and get the maximum out of those tools.
8. Google Operators
There are hundreds of Google operators that can be used to make the life of an SEO easier by allowing search results to be well targeted. Here is a complete list of Google operators.
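A few commonly used operators, to give you the flavor (example.com is a placeholder domain):

```
site:example.com               # pages Google has indexed from example.com
site:example.com intitle:seo   # indexed pages with "seo" in the title
inurl:guest-post seo           # pages with "guest-post" in the URL
filetype:pdf seo checklist     # restrict results to PDF files
"exact phrase" -unwanted       # exact-match a phrase, exclude a term
```

Operators can be combined, which is what makes them so useful for tasks like auditing a site's indexed pages or researching competitors.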
9. Browser plugins to show nofollow links
When doing SEO, human errors can always creep in, because adding nofollow to a link does not make any visual difference. Having a plugin that shows nofollow links differently from other types of links always helps. It not only helps you make the needed links nofollow but can also help you find links that you accidentally marked nofollow when you did not want to.
There are quite a few Firefox plugins like this or this, but I prefer the ChromEdit Plus option suggested by Matt Cutts, for the reason that I can edit the CSS for the nofollow links; adding a single-pixel border to image links with the nofollow attribute breaks the site design.
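As a minimal sketch, a user-stylesheet rule along these lines highlights nofollow links; the red dashed outline is just an example style, so adjust it to something that doesn't clash with your design:

```css
/* Outline any link whose rel attribute contains "nofollow".
   An outline, unlike a border, does not shift the page layout. */
a[rel~="nofollow"] {
    outline: 1px dashed red;
}
```

The `~=` attribute selector matches "nofollow" even when rel contains multiple space-separated values.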
10. Understand Meta Tags
Meta tags are not confined to the keywords and description of a page; there are a lot of other meta tags that can make or break a site's SEO, such as noindex, nofollow (this is different from the link relationship attribute rel="nofollow", which only prevents search engines from following an individual link), noodp, noydir, etc.
The Syntax for robots meta tag is
<meta name="robots" content="nofollow">
- noindex – Prevents page from being indexed.
- nofollow – Prevents search engines from following any links on the page.
- noarchive – Prevents search engines from showing a cached copy of the page.
- noodp – Blocks Open Directory Project description being used as search result description.
- noydir – Blocks Yahoo Directory Titles & Descriptions being used in Yahoo search results.
11. Google Authorship
Google is not only displaying the author image in search results but is also exploring Author Rank as a part of Google's ranking algorithm, possibly coming any time soon.
12. Google +1 Integration
Moz recently published a study of the Amazing Correlation Between Google +1s and Higher Search Rankings, and soon after, Matt Cutts said that Google's +1 has no direct impact on the web search ranking algorithm. He further stated that
If you make compelling content, people will link to it, like it, share it on Facebook, +1 it, etc.
Again, I have never seen such a quick response from Google to any study whatsoever, and this makes me believe even more in Moz's correlation. What it means is: to make your site SEO friendly, your site's Google+ page should be linked with rel="publisher" from your site, and your site should provide an easy-to-use interface to share content on Google+. Note that I am not talking about an option to +1 a URL or a page but an option to share the content or link on Google+.
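A sketch of the publisher link in the page <head> (the Google+ page ID here is a placeholder, not a real one):

```html
<link rel="publisher" href="https://plus.google.com/1234567890" />
```

This tells Google which Google+ page represents your site as a whole, which is distinct from rel="author" markup on individual articles.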
13. Google Webmaster Tools
Google Webmaster Tools provides a lot of reports to help improve your site's HTML and reduce errors. A couple of reports that I often use are
- “HTML Improvements” for duplicate or missing titles and duplicate or short meta descriptions
- 404 error reports help me identify links that are not pointing to the correct pages of my site. The error could be that a page has moved but I have not updated the old link, or it could be that someone linked to my article but then messed up the link by adding a few extra characters, giving a 404 error. I can then contact the Webmaster to update the link, or even use a redirect on my site to send users to the right content.
14. Bing Webmaster Tools
Bing Webmaster Tools has some really cool SEO tools that aren't even part of Google Webmaster Tools as yet. Bing is really working on tools to help Webmasters with SEO, and if you remember, Bing pioneered the link Disavow tool and it was only then that Google joined the race.
Some of the Bing tools that I love are
- Bing Link Explorer Tool to help you analyze links to any site (read competitor) that you don’t own.
- SEO Analyzer to Analyze on Page SEO Elements that you may have missed.
15. Submit XML Sitemaps to Search Engines
Every CMS has an option to generate an XML-based sitemap, either through a plugin or as part of the core functionality, and can also notify search engines like Google / Bing about changes to the sitemap.
Note: The notification to Google about change in sitemap is different from submitting the sitemap in your Webmaster account.
Once you add the XML sitemap in your Google / Bing Webmaster account, you can view a lot more details about the index status of the URLs, as well as any errors or warnings in the sitemap.
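For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/product-type-1.php</loc>
    <lastmod>2013-09-10</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Your CMS plugin generates this automatically; knowing the format helps you spot the errors and warnings the Webmaster tools report.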
16. W3C Validator
HTML is a markup language, which means you can have wrong or missing HTML tags and still get the layout you intended.
An example could be an extra open DIV that you missed closing at the end, or an extra closing DIV tag at the end of your page: it may not impact the layout of the content in the browser, but it is still an error that needs to be fixed. To make your HTML error free, the W3C HTML Validator is a handy tool that every Webmaster, and especially every SEO, should be aware of.
17. Google Alerts
Google Alerts is a service by Google to get email notifications for search terms you wish to monitor in Google search results. The ideal use of Google Alerts is keeping an eye on your brand name or your competitors' brand names being mentioned in blogs, forums or anywhere on the internet.
18. Canonical URLs
You can have multiple URLs for the same or similar content, and the canonical URL is the preferred version among that set of similar-content URLs. An example: if you have a product listing with sorting options, then a page can display results sorted alphabetically or based on other factors.
The sort parameter changes the content of the page only slightly, producing almost the same content on multiple URLs, so you need to add a rel="canonical" link to the <head> section of the product page like this
<link rel="canonical" href="http://www.example.com/product-type-1.php" />
to let search engines know which is the best url for the product details.
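For example, all of these hypothetical sort variants would then point to the single canonical URL above:

```
http://www.example.com/product-type-1.php?sort=name
http://www.example.com/product-type-1.php?sort=price
http://www.example.com/product-type-1.php?sort=newest
```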
19. Analyze Server Error Logs
SEO is not confined to HTML and Web Development; it also includes the ability to analyze server error logs for 404 Not Found errors, Internal Server Errors or any other error. Google Webmaster Tools will also report such errors, but waiting for Google to find and report them is not a wise thing to do, so analyzing server logs for errors is the preferred choice.
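A minimal sketch of scanning an access log for error responses, as an alternative to waiting for Webmaster Tools. The combined-log-format regex and the sample lines are assumptions; adjust them to match your own server's log format:

```python
import re
from collections import Counter

# Assumed combined log format: ... "GET /path HTTP/1.1" 404 512
LOG_LINE = re.compile(r'"[A-Z]+ (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def error_counts(lines, statuses=("404", "500")):
    """Count (status, url) pairs for the given error status codes."""
    counts = Counter()
    for line in lines:
        match = LOG_LINE.search(line)
        if match and match.group("status") in statuses:
            counts[(match.group("status"), match.group("url"))] += 1
    return counts

# Hypothetical sample lines; in practice read them from your access log file.
sample = [
    '1.2.3.4 - - [10/Sep/2013:06:25:24 +0000] "GET /old-page HTTP/1.1" 404 512',
    '1.2.3.4 - - [10/Sep/2013:06:25:25 +0000] "GET /index.php HTTP/1.1" 200 2048',
]
for (status, url), count in error_counts(sample).items():
    print(status, url, count)
```

Sorting the resulting counts shows you at a glance which broken URLs are hit most often and are worth redirecting first.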
20. Analyze site Performance
There are quite a few tools and plugins to help you analyze your site's speed and performance.
- Google PageSpeed Insights as a web interface or browser plugin,
- YSlow as a Firefox plugin by yslow.org,
- GTmetrix – a tool that uses Google's PageSpeed and YSlow as well as other tools to help you analyze pages for performance, along with suggestions that are very easy to understand and implement.
Bonus Tip: There is a page timings report in Google Analytics under Content > Site Speed which shows the average page load time for users, from initiation of the pageview (e.g. a click on a page link) to complete loading of the page in the browser.
21. Mobile SEO
Smartphone users are growing exponentially, not only in India but worldwide, and Google has also taken the initiative on Mobile SEO, i.e. removing sites from mobile search results that are not mobile friendly (official announcement) or too slow on mobile devices (not official yet, but highly discussed in the SEO fraternity).
Making your site mobile friendly is not enough, so you need to focus on making it really fast for the end user as well. If your site on mobile devices just hides a few elements to make the design responsive, it may be a good workaround as of now but may not be a good idea in the long run.
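For a responsive design, the standard viewport meta tag in the page <head> tells mobile browsers to render the page at device width instead of a zoomed-out desktop width:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without it, even a responsive layout is typically rendered as a shrunken desktop page on phones.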
22. Check Server Header
There are multiple ways of redirecting an old page (or url) to a new page (or url).
- Meta Redirect
- Temporary 302 Header Redirect
- Permanent 301 Header Redirect
Google recommends a 301 permanent redirect, which means your server header should contain the right header code of “301 Moved Permanently”.
As an end user, all redirects behave the same way, but to make the redirects work for Google as well as for your users, you need to be able to check the server header response and be sure your server is sending the right header code. Use SEOBook's Server Header Checkup Tool to verify server headers.
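If you'd rather check headers yourself than use a web tool, a minimal sketch using only Python's standard library fetches a URL without following redirects and reports the status code plus the Location header. It assumes a plain http:// or https:// URL with no authentication:

```python
import http.client
from urllib.parse import urlparse

def redirect_status(url):
    """Return (status_code, location_header) for a URL without
    following redirects; location_header is None if absent."""
    parts = urlparse(url)
    conn_class = (http.client.HTTPSConnection
                  if parts.scheme == "https"
                  else http.client.HTTPConnection)
    conn = conn_class(parts.netloc)
    try:
        conn.request("HEAD", parts.path or "/")
        response = conn.getresponse()
        return response.status, response.getheader("Location")
    finally:
        conn.close()

# Usage (hypothetical URLs): a properly configured permanent redirect
# from /old-page to /new-page should yield something like
# (301, "http://www.example.com/new-page").
```

A 301 in the first element of the result is what you want for permanently moved pages; a 302 there means search engines may keep treating the old URL as the live one.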
23. WayBack Machine
The Wayback Machine creates an archive of web pages that can be referred to in the future. In SEO these days a very common statement is "Things were working fine until …". It is also very common that a client hired someone to do SEO for his or her site and suddenly a Google Panda or Google Penguin update hit. To view the content of a site as it was on some previous date, the Wayback Machine is the right tool. It helps you understand what on-page SEO was done (or not done for so long) on the site that made SEO traffic drop.
24. FTC Guidelines
The Federal Trade Commission (FTC) has made it mandatory to disclose promotional or paid content as well as affiliate links, so be aware of the complete FTC Disclosure Guidelines.
25. WordPress
WordPress for SEO is almost unavoidable. No matter what type of site you have, somewhere you will find a connection to WordPress. The typical 5-page website for your company can be in WordPress, an eCommerce site can have its blog in WordPress, or you can have a site on WordPress.com just to pass links to your main site.
WordPress is unavoidable for SEO, so it makes complete sense to know the basic workings of WordPress.