I spent this evening reading about link building. The question now is: what is acceptable link building? The almighty Google has recently issued warnings to us yet again. I know what used to be OK, but nowadays Google has said that any paid link is against its guidelines, and that it plans on penalizing both sellers and buyers of links.
I never quite understood penalizing buyers of links. Are you saying now: do not buy links for your own website to rank well, buy bad links for your competitors' sites so they get penalized? I think I may have to run an experiment on this.
Well, it seems the masses are getting fed up with Google's rules. I have read in a very popular SEO (Search Engine Optimization) forum that, based upon Google's latest guidelines, nearly every single SEO company on earth is now going to be considered black hat (a term used to describe SEO companies that do not 100% obey Google). Google does not want people optimizing anything: they do not want link building, and while they accept link baiting, link baiting is now the equivalent of fishing in a pond with 8 BILLION other pieces of bait. Even if you have the best bait in the pond, how is the fish to find you? Google, can you answer this? So tighten your drawers, Google is going to change the rules yet again.
Many SEO experts have been discussing whether anything is really going to change, or if Google is just going to scream longer and louder this time. Truth be told, most sharp link builders are always going to be 3-4 steps ahead of Google, and Google will never catch up to these guys. So why bother? All Google is doing is increasing the profitability of the really good link builders. Links are the glue that keeps the internet as one; without them, the internet would be hundreds of thousands of little internets, all completely separate. You can never do away with links, Google.
Most SEOs now know that Google does not look fondly upon links in footers and sidebars, or upon links labeled as sponsored ads or sponsored links, so these old-school techniques are now fruitless and hardly worth paying for.
So what Google has done is raise the bar, and what we are all going to see is the price of great links going way UP! In recent history you could get a nice bunch of PR4-5 links for, say, $35 per month; I bet that in the next 6 months a solid PR5 link is going to be $250 a month. Why, you ask? Because Google is making it much harder to build out links that will pass link juice and count towards your search engine ranking and placement. Google has been applying discounts to nearly all links: footer links, sidebar links, site-wide links and any other kind of link floating around your website are probably only worth about 25% of what they used to be worth, because Google keeps discounting these types of links further and further.
Knowing this, a multitude of options and opportunities for links in text have been born. Google views these in-text links as premeditated links. Of course, people are all over this theory. There are pay-for-review sites now (where you post a review of a site or product and they include links back to your website), there are 4 versions of software that will index a website and turn any piece of text into a link, and so on.
So the next step in Google's attempt to stay ahead of the link market is relevancy. Not your old standby of a couple of sentences on topic: now you will be required to have an entire page, probably 400 words or greater, on topic (and you had better not even try to have more than 5-6 links per page), or Google will probably not count those links.
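To make the idea concrete, here is a small sketch of the kind of on-page check being described: count the words and outbound links on a page and compare them against the thresholds above. The thresholds are this post's speculation, not published Google rules, and the sample HTML is made up.

```python
# Sketch: count words and outbound links in a page's HTML, then apply
# the rough thresholds from the post (400+ words, no more than 5-6 links).
# These thresholds are speculation, not published Google rules.
from html.parser import HTMLParser

class PageStats(HTMLParser):
    def __init__(self):
        super().__init__()
        self.words = 0
        self.links = 0

    def handle_starttag(self, tag, attrs):
        # Count anchor tags that actually point somewhere.
        if tag == "a" and dict(attrs).get("href"):
            self.links += 1

    def handle_data(self, data):
        # Rough word count: whitespace-separated tokens in text nodes.
        self.words += len(data.split())

stats = PageStats()
stats.feed('<p>Some on-topic copy with <a href="http://example.com">one link</a>.</p>')
looks_ok = stats.words >= 400 and stats.links <= 6
```

This toy page obviously fails the 400-word test; a real check would feed in the full rendered page.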
Google is driving the price of good, under-the-radar links UP, so you had better forget about the old standby link brokers, where quantity was the ticket. Now you had better look for quality, as this is all that will matter; one super-high-quality link is going to be all you need to get vastly improved rankings in the near future.
It has been documented that Google, among other search engines, adds value to domains that have static IP addresses. I suppose that Google must view the expenditure on a static IP address as a sign of how serious the webmaster's or domain owner's intentions are moving forward. It is unlikely that spam sites would go to the great expense of putting all of their autogenerated websites on separate IP addresses. This would also go hand in hand with the information about the length of domain registration and the age of a domain having significant influence on your search results.
You may have heard people in the link-building world discussing C class IP addresses. This refers to the third block of numbers in an IP address, which is usually the one that remains static when assigning IP addresses to multiple domains on one server.
Example: An IPv4 address consists of 4 parts, AAA.BBB.CCC.DDD – the first part is called the A class, the second part the B class and the third part the C class. Any two IPs that share the first three parts are in the same C class. So for example:
111.222.133.1 and 111.222.133.2 are in the same C class, but
111.222.132.1 and 111.222.133.1 are in two different C classes
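The comparison is simple enough to sketch in a few lines of Python. The helper names are my own, and the sample addresses are made up (real IPv4 octets only go up to 255):

```python
# Sketch: check whether two IPv4 addresses share a C class,
# i.e. whether their first three octets match.
def c_class(ip):
    """Return the first three octets of an IPv4 address as a tuple."""
    parts = ip.split(".")
    # isdigit() is checked first so int() is never called on junk.
    if len(parts) != 4 or not all(p.isdigit() and 0 <= int(p) <= 255 for p in parts):
        raise ValueError("not a valid IPv4 address: %r" % ip)
    return tuple(int(p) for p in parts[:3])

def same_c_class(ip1, ip2):
    return c_class(ip1) == c_class(ip2)

print(same_c_class("111.222.133.1", "111.222.133.2"))  # True
print(same_c_class("111.222.132.1", "111.222.133.1"))  # False
```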
Most small hosts manage very few C class IP addresses. The search engines have in the past discovered link builders who had multiple sites on the same C class IP linking to one another. Google has since devalued or removed any value from links between sites on the same C class IP addresses. They have also reduced the ability of multiple sites on one IP to pass link juice to any other website.
Until recently, hosts did not care about keeping sites on dedicated C class IPs. Recently you have seen a rise in SEO (search engine optimization) hosting services: people who have figured out this C class IP address formula and are offering the ability to have lots of C class IP addresses.
Last night I was looking to build out some keyword lists, and I remembered a few cool tools for developing and building them out.
One is called Spyfu.
Spyfu gives you the ability to review your competitors' keywords: what they are focusing on and what they are using successfully (this requires a little homework).
Another is called Keyword Spy.
Keyword Spy's service is similar to Spyfu's, although it offers better reporting and a little less data than Spyfu.
Once you have compiled your keywords, if you're looking to go deeper, you can use a keyword combiner like adwords generator.
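A keyword combiner is nothing mysterious: it just crosses every modifier with every core term. A minimal sketch of the idea, with made-up example terms:

```python
# Sketch of what a keyword combiner does: cross every prefix with
# every core term and suffix to build out a long-tail keyword list.
# The terms below are made-up examples.
import itertools

prefixes = ["cheap", "best", ""]
terms = ["running shoes", "trail shoes"]
suffixes = ["for women", "reviews", ""]

# The empty strings let the bare term (or term + one modifier) appear too.
combined = [" ".join(part for part in combo if part)
            for combo in itertools.product(prefixes, terms, suffixes)]
```

With 3 prefixes, 2 terms and 3 suffixes this yields 18 phrases, including the bare "running shoes" from the empty prefix and suffix.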
Just a few basic tips for keyword discovery and manipulation.
MSN, while lacking in its webmaster tools, has recently added some new tools in its MSN adCenter Labs.
These are free tools, which are just fantastic!
If you're looking to optimize your website for particular keywords, or looking to micromanage your PPC ads, the commercial intent detection tool rocks. It basically displays the typical intent of the searcher (based upon their search query) and the probability that the user will actually buy. You can also use this tool to query URLs or domains to see what the typical visitor's action or reaction is going to be, based upon statistics.
They also have a keyword forecast tool, which will offer you details of internet search volume on a daily and monthly basis, search trends and the demographics of the searchers. They offer this in multiple views, as detailed as one could possibly want (this is quite unusual for MSN or Microsoft).
The tools they are offering for free are second to none: text ad writing recommendations, keyword group detection, search funnels, product classification, even a tool for keyword extraction, with keyword price estimation coming soon.
MSN has also created a powerful new tool with API capabilities called the Keyword Services Platform. This information is so valuable for managing your e-commerce business; I am just starting to dig into it, but it has real promise.
I cannot recommend these tools highly enough, get in there and dig for the gold nuggets you need for your business’s future success.
Microsoft has outdone itself with these tools; not even the master of the net, Google, has created anything so useful, let alone made it free.
OK, I thought I should circle the wagons on webmaster tools and show you something that will compensate for the shortcomings of all the search engines' webmaster tools. The software is called SEOAdministrator.
This has some very good tools. Best of all, it will easily run ranking reports for your site, store the reports and show you updates each and every time you run it. This lets you watch your search engine rankings rise and (hopefully not) fall, and it also shows you how many places each way. It works with nearly all major search engines, and as long as your searches do not get too extensive, it runs smoothly without the use of proxies, etc. It offers the ability to export reports, and an unlimited number of URLs and search terms.
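The core of any ranking report like this is simple: find your site's position in an ordered result list and compare it to the last run. A rough sketch, with made-up function names and made-up data (a real tool would fetch the live results):

```python
# Sketch of the core of a rank tracker: given an ordered list of result
# URLs for a query, find your site's 1-based position and the movement
# since the previous run. All names and data here are illustrative.
from urllib.parse import urlparse

def rank_of(domain, result_urls):
    """Return the 1-based position of the first result on `domain`, or None."""
    for pos, url in enumerate(result_urls, start=1):
        if urlparse(url).netloc.endswith(domain):
            return pos
    return None

previous_rank = 7  # stored from the last report run
results = [
    "http://other.com/a",
    "http://mysite.com/page",
    "http://third.com/b",
]
today = rank_of("mysite.com", results)
movement = previous_rank - today  # positive means the site moved up
```

Storing each run's positions per query is what lets the software show you rises and falls over time.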
The software also has lots of other functions/services: link analysis, site index tools, link exchange tools, among other things.
I use this exclusively; I have tried most other software packages out there, and this seems to be the most reliable of them all.
Does anyone know if Microsoft's search is called MSN or Live? I have no idea. They seem to be mostly separate, but there are no MSN webmaster tools, only Live Webmaster Tools.
Live Webmaster Tools is Microsoft's first shot at giving webmasters some information about their sites, and it is weak. It does give you the ability to submit a sitemap and validate your website (by adding an XML file or meta tag to your site). Once validated, Live does show you little pieces of good information (more than Yahoo). Related to how Live views your website, it will let you type in a search term and then display which pages from your site appear in Live search. They have this weird 1-5 scale (little green dots) that shows you how important your page is for a particular search query, which gives you a clue, but with a rank of 5 (your page being viewed as important to Live) you may only be in the top 25 search results. I am not sure why they cannot just tell you exactly where you're ranked. Anyway, they seem to have gotten married to teasing us with this 1-5 rank, which is somewhat useless in the big picture, when you can simply search Live and find where your site is ranked anyway (who knows).
Unfortunately, Live has maintained this dumb little 1-5 green bar thingy for all of its information. It has a domain rank, it will show you the most important external links on a scale of 1-5, your most important internal links on a scale of 1-5, how many pages from your site are indexed and when the Live spider last indexed your pages. All great info, except for the 1-5 green bar… kill that, someone!
Overall, Live has the right ideas, but they really need to dump this 1-5 grading dot chart and give us the nuts and bolts. Show me exactly where I rank for, say, my top 100 queries, and show me all the links you see, both external and internal. If Live makes these changes, they may be onto something.
Do yourself a favor: do not disregard Live/MSN. Yahoo is giving up market share and MSN/Live is getting it.
Yahoo has its own webmaster tools, which it calls Site Explorer. In theory it is supposed to be similar to Google's Webmaster Tools, although it is a half-hearted attempt and really does not offer you much insight into your website's stats within Yahoo.
Yahoo Site Explorer does offer the opportunity to submit a sitemap to Yahoo, and to review what and how many pages from your site are actually indexed (within the Yahoo search results). It will also display your external and internal links, and this is one spot where Yahoo really shines: Yahoo displays all of your backlinks (links from external sites), unlike Google, which traditionally only shows a certain percentage of them. Your Yahoo backlinks can also be exported to Excel so you can thumb through them and sort them efficiently. Lastly, Yahoo does provide you the opportunity to remove pages or entire websites from its index, which can be handy if Slurp (the name of Yahoo's search engine spider) indexes pages that you did not want indexed.
Yahoo has a significant amount of work to do to get its version of webmaster tools up to snuff: offering the basic search results for your site and your website's ranking would be the bare minimum.
Google offers something called Webmaster Tools. Do not be alarmed; this is not anything technical that the average Joe could not figure out. It is actually quite simple. You simply need a Google account, which most people already have; with it you can set up a Webmaster Tools account. Google's Webmaster Tools is by far the best of the Big 3; Yahoo and MSN/Live also have their own versions of webmaster tools.
Initially what you need to do is add a meta tag to your site. Google will generate the meta tag for you; all you're required to do is cut and paste it into the <head> section of your HTML, above the <body> section. In doing this you're letting Google know that you're the owner of the domain (if you use a WordPress blog like this one, they alternatively offer you the ability to upload an HTML file, which is simply a blank web page with a name like Google80948478734876 — example only, do not use this). Once you have proved to the almighty Google that you are the website or domain owner or webmaster, they will start to show you lots of cool information about your site and about how Google sees your website.
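Before asking Google to verify, it is worth sanity-checking that the tag actually made it into your page's <head>. A small sketch using only the standard library; the meta tag name and content value here are placeholders, not a real verification code:

```python
# Sketch: confirm a page's HTML contains a given meta tag before
# requesting verification. The tag name and content are placeholders.
from html.parser import HTMLParser

class MetaFinder(HTMLParser):
    """Record the content attribute of the first <meta> with a given name."""
    def __init__(self, name):
        super().__init__()
        self.name = name
        self.content = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name") == self.name:
                self.content = attr_map.get("content")

# In practice you would feed in the HTML fetched from your live site.
html = '<html><head><meta name="verify-v1" content="abc123=" /></head><body></body></html>'
finder = MetaFinder("verify-v1")
finder.feed(html)
```

If `finder.content` comes back `None`, the tag never made it into the served page (a template cache is a common culprit).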
Once you're in here, make sure you read my earlier post on sitemap creation and submission. If you have not yet submitted your sitemap to Google, please click on the Sitemaps icon and add it, so Google knows about every single page in your site and will, in good time, index most every page. This is very important for long-tail search terms.
OK, back to Webmaster Tools. Once the meta tag or HTML file is uploaded, you will need to verify it; this is instantaneous. Once Google confirms it, Google will track lots of cool things, like how many and which pages from your site are indexed (within the Google search results), what the top search queries for your website are, and where you rank in the Google index for various search queries. It will also show you when Google last visited your home page, whether Googlebot (the name of Google's spider) encountered any issues indexing your site, a list of all internal links (links within your website) and how many external links your site has (links from other websites to yours).
Lastly, Google Webmaster Tools offers a tools section where you can review your robots.txt file (a topic for another day). You can generate a robots.txt file, review Google's crawl rates, etc. Lots of neat eye candy. What I look at often is how many pages are within Google's index, how often Googlebot visits my site, what the top search queries are and in what position my site appears.
I was rebuilding sitemaps today, updating them and ensuring they are all in the current XML industry-standard format (this can be labor intensive with 50+ websites) so all search engine spiders can index the sites effectively. You can find additional information about sitemaps and the latest industry protocol at sitemaps.org.
You need to make sure when creating your sitemaps that they are in .xml format; this is the preferred format across all search engines. Be sure to submit your sitemaps to the big three: Google, MSN (aka Live.com) and Yahoo. Submitting your sitemap to the big three search engines is one of the first steps in ramping up your website's search engine optimization. Adding a sitemap tells the search engine spiders everything there is to know about your website, its structure and what pages are available for indexing.
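The sitemaps.org format itself is tiny: a `urlset` element with one `url`/`loc` entry per page. A minimal sketch that builds one from a list of URLs (the URLs are placeholders for your own pages):

```python
# Sketch: build a minimal sitemap.xml in the sitemaps.org format from a
# list of page URLs. Real sitemaps may also carry optional <lastmod>,
# <changefreq> and <priority> children per <url>.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap(["http://example.com/", "http://example.com/about"])
```

Save the output as sitemap.xml at your site root, then submit that URL to each engine's webmaster tools.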
It can be a little annoying setting up accounts across all three search engines, but it will pay big dividends. You can use the features available in each search engine's webmaster tools for various SEO projects, and you should visit these sites relatively often. I will get into the various versions of webmaster tools at a later date.
Stop doing whatever it is you're doing, and make sure you have current XML sitemaps submitted to the big three search engines.
I woke up this morning to find out that Google had seriously altered the path of my weekend: two of my clients' pages dropped in search. Man, why is it that Google has to play big brother? Google created a good system for ranking websites based upon numerous criteria: meta info, page content, website architecture, links (both internal and external) and numerous other off-site components. The problem is that smart people have figured out what Google does to rank websites, and they are exposing it. Now, instead of constantly working to improve its method, Google has decided to simply become the enforcer of its territory. They invented numerous technologically advanced theories 10 years ago, have basically become the fat cats, and have not continued to expand on their technology. There are numerous things they could do to improve their search results, but they are now a public company with billions of dollars in profit, and they are spending every resource possible to ensure that their profits and revenues continue to soar.
People involved in search engine optimization (aka SEO) are constantly watching, reading and sharing everything they figure out about search engine spiders: how they act, and what works and what does not. The internet (the vehicle Google has used to reach fame and fortune) is the ultimate tool for real-time communication, and it is what people involved in search engine optimization use to share insights and theories about search engine spiders, posting these thoughts and ideas across the net on blogs, forums, wikis and email. Amazing how the internet (Google's playground) has become its biggest hurdle for future growth… funny how that works.