Search engine marketing articles

Many of you have read my article entitled Search engine optimization guide with enthusiasm and believe that if you follow the guidelines within it and then submit to hundreds of search engines, you will be flooded with visitors and prospects eager to buy your product.

by: David Callan

Only the big search engines count

That might be true, but you can bet your bottom dollar that only the big search engines are generating any significant number of visitors for you. By big search engines I mean Google, Yahoo and the other popular engines. The truth is that you have just wasted your time submitting to hundreds of search engines, because only a small percentage (and I mean small) of them will send you any decent monthly traffic. They might claim to be visited lots of times, but those visits are mostly other webmasters submitting their URLs, not people actually using the search engine to find websites of interest to them.

In fact there are currently hundreds of search engines out there, and most of them are not much more than an advanced FFA page. There's only one way I would submit to these, and that's with good automated submission software. To clarify that a bit more: I would use automated software for the less important sites, but I would always hand submit to the top 15 or so engines.

So once again, beware of ads such as 'submit to the top 500 search engines for only $99'. I say this because generally only the top 10 will drive traffic to your site. It doesn't take much time to manually submit to these top engines, and your $99 would be better spent on Overture.com, Google Adwords, buying ads in ezines or any of the other marketing methods available to the modern Internet marketing professional. At the moment the top ten search sites - meaning both directories and search engines - account for just over 93% of all search engine referred traffic going to websites. The remaining 6-and-something percent is spread across hundreds of sites claiming to be search engines.

Even then, the 11th to 15th biggest search engines make up most of that remaining figure. So what are the search sites you need to concentrate on as a webmaster looking to drive proper traffic to your site? Well, here are the sites I consider the most important. This list is of course open to interpretation, but here's my opinion anyhow. A point to note is that the first five sites on the list are ordered by importance to webmasters and not necessarily by the amount of traffic the individual site can send to a website.


Google.com

The most popular search engine by far; a good ranking here for two or three of your keywords could result in a lot more monthly traffic for your site. Good rankings can be achieved by distributing your keywords throughout the important places on your pages and by having lots of incoming links. My article Google ranking tips should really help you here.


Yahoo.com

Yahoo, although losing ground to Google, is still a very popular place for surfers to search. Its search results are provided by Google, so in all but a few cases good rankings in Google mean good rankings in Yahoo. The Yahoo directory is still very popular with surfers, so check out Yahoo submitting tips for the lowdown on getting into their index.

dmoz.org (ODP)

Dmoz.org itself will not send you much traffic. I have ranked it the third most important search site anyhow, as its true importance lies in the fact that lots of third party sites, including Google, use ODP data on their sites. What this means is that if you're listed on dmoz.org you will soon notice your site being found on sites such as the Google Directory, AOL Search, Netscape Search and Lycos, to name just a few. Open Directory Project guide is a very detailed guide written by me on getting into the ODP successfully.


Looksmart.com

Just like dmoz.org, Looksmart's main attraction is that it supplies data to many third party sites. Their network claims to reach vast numbers of people. However, the main advantage of listing with Looksmart stems from the fact that MSN uses their data prominently within results for searches conducted on MSN.com, MSN.co.uk and the other international MSNs. If you're a noncommercial site you can get into Looksmart via zeal.com.

alltheweb.com (Fast)

Alltheweb.com is a very popular search engine which is currently seeing much growth. Link popularity is a factor on this engine, so get those links. Having keywords within <h1> and <h2> tags also seems to be quite important here.

Search engine optimization guide.

Not by any sense of the word is search engine marketing a secret. However, there are secrets and tricks to it. In this article we'll try to educate you on the different factors which contribute to SEO. SEO is becoming a 'hot word' in Internet marketing; in case you don't already know, it stands for Search Engine Optimization, and it is a vital part of any online marketer's quest for success.

It means preparing your website with the 'right ingredients' for the search engines to 'like' it. If they 'like' your site, you can expect to get hundreds if not thousands of free visitors from them. In fact, some research has found that at least 65% of traffic to websites comes from search engines. This is just a general figure and of course varies depending on the other marketing methods you employ; however, it does give you an indication of how important SEO is to any online business.

OK, now we'll go through the different factors which affect your site's ranking. First things first!

Choosing the right Keyphrases.

With most of the big 'useful' engines now operating express submission services, you will want to spend your dollars wisely. Do you want:
· Highly targeted visitors?
· Very good visitor to sale ratios?
· Lots of profits?

If you answered yes to these questions, then forget about just choosing the right keywords and realize the benefit of keyphrases. This approach is likely to generate a much higher click through rate from highly qualified visitors.

Let me explain. How many people do you think search for the word music? ''A lot''. How many results come up for music? ''A lot''. What do you think your chances of appearing near the top are? I'll answer that one for you: SLIM TO VERY SLIM. Not to worry though, because visitors who click through to a site after searching for music are not highly targeted anyway; you don't know if they're searching for:
1. music tickets
2. music lessons
3. music CDs
4. music related equipment
5. music software
6. music news
7. etc., etc., etc.

It's so vague that your time and effort would be better spent on other methods. However, if a search engine visitor types in 'Special Offer Music CDs' and you have targeted that keyphrase in the prominent places where search engines look on your website, you have a much better chance of appearing near the top of the pile. And if the visitor does click through on your listing, you are much more likely to make a sale, because you have exactly what the searcher was looking to buy.


The more specific your keyphrases are, the more highly qualified the visitors you'll get from the search engines.
Qualified visitors = Sales! Sales! Sales!
Try also to regionalize your keyphrases if possible. This will of course only apply to certain websites. What I mean is that if you operate out of a specific location, target that location and the surrounding areas in some of your keyphrases. This helps to improve the quality of your visitors.

Example: Imagine that the biggest & best car dealer in Detroit happens to have a website, but they're unwise and their title tag is as follows:
#1 Car Sales Dealer and Garage in America

They think this will bring them lots of visitors because it has an alphabetically high start (#1) and decent keywords, and they're probably right. The visitors it brings, however, will not exactly be targeted. A visitor from the other side of America could visit their website, but are they going to travel across America to buy a car from someone just because they visited their website? No.

Now let's say a man from Detroit is looking to buy a new car. The dealer he bought his two previous cars from has closed down and he doesn't know of any other dealers around, so he decides to use the Internet to look for "Detroit car dealers". The likelihood is that the above car dealer's competition will show up, because they have used a title like this: #1 Detroit car dealer, garage and sales website. Obviously the man looking to buy a car in Detroit is going to be interested in that site. It's for these reasons that, if it's appropriate for your business, I recommend you always regionalize your keyphrases.

If you don't know what words to target in your keyphrases, there's a simple solution. Visit wordtracker.com and do their free trial. Enter a word that you think people searching for your site will enter on the search engines, and Wordtracker will give you a list of 15 related words and keyphrases. These are what you'll use to build your keyphrases. Now you should know what keyphrases you want to rank well on in the search engines, but where do you put them on your website pages to improve your chances of appearing near the top of the results?

Search engine toolbar guide.

Search engines, do you use them? Of course you do! Most if not all Internet users use them at some stage or another, and why not: they are very easy to use. Simply go to the address of the engine, be it Google.com, Alltheweb.com or any other, enter keywords relating to what you want to find, and voilà! However, many engines now provide an even easier, more direct way to access their databases through the use of small applications known as toolbars. These search engine toolbars provide extra search and surfing facilities, usually from within Internet Explorer, though some toolbars now have Netscape alternatives, albeit with limited capabilities.

These extra search facilities allow you to conduct a search from your browser without actually having to visit the search engine's site itself. The toolbar will automatically bring you to the engine's results page and you can take it from there. When I speak of extra surfing facilities I'm mainly speaking of features which aid web surfing, such as the Google toolbar's latest feature, a pop-up ad blocker. I, like most people, find pop-up ads very annoying, so this is a really good feature.

These features are very handy for the ordinary surfer, but what about us guys and gals - the webmasters that make the web happen? Do any of these so-called toolbars have elements within them that will help us get more visitors, more profit and hence more success? Yes, they do. This article is your guide to using the two main toolbars available on the net, from Alexa and Google, to your advantage while promoting and running your site. A note before we start: the Alexa toolbar's search feature is powered by Google and not by Alexa as one might expect; Alexa simply provides extra information in a way that webmasters will find very useful and helpful.

Alexa toolbar.

Admittedly, when I first downloaded the Alexa toolbar I was confused by the ranking figure displayed in the center of the toolbar. The confusion stemmed from the fact that I was unsure whether a better and more popular site would have a larger ranking number representing it in the Alexa site database or a smaller one. The former seemed right initially, as I presumed the figure was related to the number of page views Alexa users had given a particular site over a certain time span, so the bigger the better.

However, in my quest for the truth I hit Google and Yahoo to search for a definitive guide on Alexa's toolbar. While scanning the results on Yahoo I noticed that its Alexa ranking figure was one; intrigued, I continued to the Google.com site, where I saw a ranking of five. I knew that these two sites were immensely popular, so obviously the lower the figure the better. The figure relates to a site's popularity with Alexa users. Yahoo's figure meant that it was currently the most popular Internet destination with Alexa users, and Google was currently the fifth most popular.

This provides webmasters with great insight into the popularity of a website, which can be used to determine sites that are worthwhile link exchange partners or worthwhile to spend your advertising dollars on. If you've read my articles on reciprocal linking you will remember me saying that it's better to 'link up' with a more popular site than to 'link down' with a less popular one. Alexa can help you to always 'link up': simply visit your own site and check your ranking, then search for sites related to yours but with a better ranking.

Using Alexa rather than the Google toolbar's PageRank feature to locate good link partners has the advantage of not being search engine based. That is, Alexa ranks a site based on actual visits, not on incoming links as the PageRank system does. I prefer to use Alexa myself because I know the benefits of the free long term traffic that can come from reciprocal links alone, quite apart from the fact that they can help your ranking on Google and other engines.

Other tools included on the Alexa toolbar include the ability to view contact information for a site, so you can contact the owners directly with a link exchange proposal, and to view the backward links pointing to a site, so you can see who your competitors exchange links with and ask them to exchange with your site too. The Alexa toolbar is available free of charge from alexa.com. Currently, however, no Netscape or Opera versions are available.

Yahoo submitting tips.

Getting listed on Yahoo should without doubt be one of the most important missions on any Internet marketer's mind. Yahoo is the biggest of all the search engines - well actually that's not quite true; Yahoo is not technically a search ENGINE but a human compiled directory of websites, with no spiderbot going out to sites and indexing them. However, for the sake of this article, when I say search engines I am referring to all 'search sites'.

Did you know that recent estimates show Yahoo currently capturing an amazing 40% of all search engine traffic online? Do you know what this means? It means that roughly 2 out of every 5 searches done on the Internet go through Yahoo, and everyone on the Internet has done a search at some time or another. Nothing in the world should be clearer to anyone with a website who's just after reading the above figures: you need your site listed in Yahoo. ASAP.

To recap: if your site is not listed in the Yahoo index, you're losing lots of potential customers to competitors that are listed. With the amount of visitors Yahoo can send you even with an average listing, this could amount to hundreds, maybe thousands of dollars worth of lost profit, and lost profit is never good in business. A point to note here is that being listed under web pages does not mean you're listed in Yahoo; it in fact means you're listed with Google, because Yahoo gets its webpage results from Google. You now know how important it is to be listed in Yahoo, so let's move on to the good stuff, the main body of this Yahoo submitting tips article - how to submit to Yahoo.

First you have to determine the scope of your site. Is it commercial or noncommercial? Commercial sites that want to be listed by Yahoo must now use "Business Express" when submitting. This service previously cost a one-off fee of $199; however, the price has since risen to $299 a year. "Business Express" - what is it? Good question, and one that many people ask. It's basically the same as free submission, except that with Business Express your site is guaranteed to be reviewed within a week. Please be aware, however, that it does not guarantee your site will be accepted and added to the Yahoo index; it simply guarantees a timely review of your application.

If your site is rejected, Yahoo will allow you to appeal for free within a certain time scale (usually 30 days) of being informed of the rejection. Yahoo staff usually include in your rejection the reasons you were rejected. You should examine these reasons, fix any problems and then resubmit after a week or so. If your site is a noncommercial entity, you'll still be able to submit for free, but a review could take as long as 8 weeks or might never happen at all. Before you submit, make sure your site is 100% ready; under-construction pages need not apply to Yahoo, because they're not going to get in. Your site should be aesthetically pleasing to the reviewer, be quick loading and of course have lots of good content.

Google ranking tips.

Google is by far the most popular search engine available today, for ordinary surfers and webmasters alike. Surfers like it because of the highly relevant results it gives and the speed at which it delivers them. This is due to its complex text matching algorithm and of course the Pagerank™ system that this engine uses. More on the Pagerank™ system later.

Google is popular with webmasters and Internet marketing companies due to the highly workable ranking system it uses. Unlike other engines, where information about how results are obtained is sketchy at best, Google actually publishes information on its site about the results it produces. Hence webmasters have things they can do to produce higher rankings.

What also makes Google popular with webmasters is the speed at which it will spider and list your site. If you're not listed in Google and submit your URL, you're usually indexed within two weeks. If your site is already listed in the index, Google should reindex it once every month, or more frequently if you've a high Pagerank™.

This indexing and reindexing time is much quicker than most other search engines'. It allows webmasters to edit their pages' properties, such as the title, first few lines of text, headings, keyword distribution and of course the number of incoming links to their site, and then quickly discover whether the changes they made were successful or not. It's because of this popularity that you need to know the workings of the Google search engine. Without knowledge of it you'll be ranked lower than all the sites whose owners are even slightly familiar with the Google algorithm, and hence you could lose lots of potential customers.

Google ranking algorithm.

Let's now continue to the main part of this Google rankings report by indulging ourselves in the Google ranking algorithm. There are two main parts to the algorithm Google uses. The first is its text matching system, whereby Google tries to find pages relevant to what the searcher has entered in the search box. The second and equally important part is of course the Google patented Pagerank™ system. I'll first go through how to make your pages relevant by discussing the text matching part of the algorithm. Google gives a lot of "weight" to the title tag when searching for keywords. It is therefore vital to make sure your most important keywords or keyphrases appear within this tag. It seems to work best if you have other words in your title tag after your keywords too, but try to remain under 35-40 characters.

I imagine many of you know this already, but Google does not use meta tags such as the keywords meta tag or the description meta tag. This is because the text within these tags can't be seen by visitors to a website, so Google feels the tags will be abused by webmasters placing lots of unrelated words in them in order to get more visitors. This lack of support for meta tags means that Google creates your description from the first few lines of text on your page. This in turn means that you have to have your keywords and phrases right at the top of your webpage; if Google finds them there your page becomes more relevant, and if it doesn't, the rest of your page has to work harder to become relevant. To see an example of what I mean, scroll back to the top of this page and you'll notice keyword rich wording similar to:

Google submitting tips, ranking high at google.com, Google ranking tips, pagerank algorithm, Google algorithm guide.
The above text includes keywords and keyphrases related to the theme of this page. Now when people search for any of those keywords or keyphrases, this page is much more likely to be near the top of the results than a page that doesn't employ this technique. Google considers keyword density in the body of a page when determining relevancy too, so make sure your keywords and phrases appear a couple of times throughout the whole page. Don't go overboard though; a density of 6-10% seems to work best.
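The density figure above is easy to check for yourself. Here's a minimal sketch of a density calculator; the sample page text and the exact counting rules are my own assumptions for illustration, and real engines certainly count differently:

```python
import re

def keyword_density(page_text, phrase):
    """Occurrences of the phrase, counted in words, as a
    percentage of all words on the page (a rough approximation)."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * (hits * n) / len(words)

body = ("Special offer music CDs. Our music CDs ship free. "
        "Browse thousands of music CDs today.")
print(round(keyword_density(body, "music CDs"), 1))  # 40.0 - far too dense
```

A real page in the 6-10% range would mention the phrase far less often relative to its total copy.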

Google has recently been noticed to give a substantial amount of "weight" to words appearing between the various header tags. These are tags designed to help you split up sections of your page, so this approach by Google seems to make sense. The header tags go from <h6>, the smallest, to <h1>, the biggest; the bigger the heading tag, the more relevant your page will become for the words within it. It is for this reason that you should always try to have your most important words within these tags as often as possible throughout your page. Other advice about making your page relevant would be to make as many keywords as you can appear within bold <b> tags. In the past Google has been known to index text in image alt tags; whether they still do or not I don't know, but it couldn't hurt to include keywords in these tags anyway.
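To make the weighting idea concrete, here is a toy scorer. This is my own illustration, not Google's actual algorithm, and the weights are invented; it simply counts a term more heavily the bigger the tag it appears in:

```python
import re

# Invented weights for illustration: bigger headings count for more,
# and bold text counts a little, matching the advice above.
TAG_WEIGHTS = {"h1": 6, "h2": 5, "h3": 4, "h4": 3, "h5": 2, "h6": 1, "b": 2}

def toy_relevance(html, term):
    """Score a page for a term by weighting occurrences per tag."""
    score = 0
    for tag, weight in TAG_WEIGHTS.items():
        for inner in re.findall(rf"<{tag}>(.*?)</{tag}>", html, re.I | re.S):
            score += weight * inner.lower().count(term.lower())
    return score

page = "<h1>Music CDs</h1><p>Buy <b>music CDs</b> here.</p><h2>Music news</h2>"
print(toy_relevance(page, "music"))  # 13: 6 (h1) + 5 (h2) + 2 (b)
```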

Google Adwords guide.

The year is 2000, and Google is by now seen as the leader in the search engine industry. Many of Google's competitors are trying their hands at different advertising models as a way to generate revenue. Google, currently seeing the most growth of them all, saw the potential it had as an advertising medium and was sure to follow suit sooner or later. It did so with the launch of a keyword-targeted advertising program aimed more towards bigger companies. However, it was not until later in the year, when Google launched the Google Adwords program, that they became a mainstream player available to even the smallest of businesses.

The original Adwords program worked well enough; however, it worked on the basis of payment by impressions, which didn't guarantee the advertiser a single click. So in February 2002 it received a major overhaul with the introduction of the Google Adwords Select program (nowadays it's usually just known as Google Adwords, as the original program has been discontinued).

What is Google Adwords?

Adwords is Google's version of the pay-per-click advertising model. It allows you to display ads which link directly to your website when searches are done for your chosen keywords or keyphrases. These ads are located to the right of the results which Google gives you for a search, and they're also displayed on Google's many partner sites, which include AOL, Earthlink, HowStuffWorks and Blogger. Recently, with the launch of Google's Adsense program, your ads could also be displayed on websites related to your keywords.

When you create a Google Adwords ad, you choose keywords for which your ad will appear and specify the maximum amount you're willing to pay for each click. Remember, Google's Adwords program uses a PPC model, so you only pay when someone actually clicks on your ad and hence visits your website. Adwords also saves you money, as its Discounter automatically reduces the actual cost per click you pay to the lowest cost needed ($0.01 above the competition) to maintain your ad's position on the results page.
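The Discounter behaviour described above can be sketched in a few lines. This is an illustration of the idea only, not Google's billing code; the $0.05 minimum is taken from later in this article:

```python
def discounted_cpc(my_max_bid, next_highest_bid, min_cpc=0.05, step=0.01):
    """Pay just enough to beat the next bid, never more than your own
    maximum and never less than the program's minimum CPC."""
    needed = max(next_highest_bid + step, min_cpc)
    return round(min(my_max_bid, needed), 2)

print(discounted_cpc(1.00, 0.40))  # 0.41 - you bid $1.00 but pay 41 cents
print(discounted_cpc(1.00, 0.00))  # 0.05 - no real competition, pay the minimum
```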

Google is competing well in this arena; in fact they now dominate the market, pulling in more advertisers and revenue than former industry leader Overture.com does. I don't know how long this will last though, as Yahoo! Inc. has just bought Overture. What has Yahoo got up its sleeve?

Advantages of the Google Adwords program

Just as the popularity of Google's search engine is derived from its strong, technologically advanced features and results, so too is that of its advertising program, Adwords. Google Adwords has many advantages over similar programs such as Overture.com and Findwhat.com. One of these has been mentioned already: the Adwords Discounter feature, which will lower your cost per click price to one cent above your nearest competitor's to allow you to stay ahead of his or her ad. This means you don't have to be constantly checking whether your competitors have lowered their bids in order to minimize your price; Google does this for you.

The way Google Adwords positions your ads is another great advantage of the program. In Adwords the position of an ad is determined by multiplying your CPC (cost per click) by your CTR (click through rate), and not simply by CPC alone, as this would allow the big fish to win all the time. Google's stipulation that your ads must have a CTR of at least 0.5% means that a company with deep pockets simply can't outbid the competition; they also have to outwit them by using good ad copy and appropriate keywords. Even if your competition is willing to pay sky high prices for clicks, this still won't save them: if they can't write good pulling ads they will be dropped from the program, leaving you to move up a position.
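Because position is bid times click through rate, a well written ad can beat a bigger budget. A quick sketch of that multiplication (the bids and CTRs below are made up for illustration):

```python
def ad_position_score(max_cpc, ctr):
    """Position score as described above: cost per click times CTR."""
    return max_cpc * ctr

# A $2.00 bid with a weak 0.8% CTR...
big_fish = ad_position_score(2.00, 0.008)   # 0.016
# ...loses to a $0.75 bid with a sharp 2.5% CTR.
sharp_ad = ad_position_score(0.75, 0.025)   # 0.01875
print(sharp_ad > big_fish)  # True
```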

Other advantages which Google's program has over similar ones include setup time and specific country / language targeting. With Adwords your ads can be live on Google within five minutes of creating them, so you can potentially begin to see results immediately; ads on Overture usually go live after a three to five day waiting period. Adwords also allows you to choose who should see your ads from among 250+ countries and 14 languages. This means you have more control over your ads and can be sure they're only shown to a highly targeted audience, which means you're more likely to be successful.

How to profit with Google Adwords

Now you know why Google Adwords is such a good thing, let's move on to how to actually use it to make your business a profit. First things first: you should determine how much you can afford to pay for a click. Doing this is important, as it enables you to better understand the amount of money you can bid on keywords in Adwords while still remaining profitable. To do this you need your conversion ratio: calculate it by dividing your monthly sales by your monthly unique visitors, then convert your answer into a percentage by multiplying by 100.

Imagine in a month you get 20000 visitors and sell 500 products, each with a gross profit for you of $50. Your conversion ratio, simply put, is (500/20000)*100 = 2.5%. This means that for every 100 people who visit your site, 2.5 buy your product. Your gross profit per 100 visitors is calculated by multiplying the gross profit on your product by your conversion ratio; to continue with the previous example, $50 x 2.5 = $125. Divide your gross profit per 100 visitors by 100 to determine how much you can bid in Adwords.

In this case you could afford to pay up to $1.25 for a visitor and still break even. Rarely will you have to pay this much for a click; remember that the minimum CPC on Google Adwords is only 5 cents, so play your cards right and you can have high profits.
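The whole calculation can be wrapped in one small function, following the steps above exactly:

```python
def max_affordable_cpc(monthly_visitors, monthly_sales, gross_profit_per_sale):
    """Break-even bid per click: conversion ratio (as a percentage),
    then gross profit per 100 visitors, then profit per single visitor."""
    conversion_ratio = monthly_sales / monthly_visitors * 100   # e.g. 2.5 (%)
    profit_per_100 = gross_profit_per_sale * conversion_ratio   # e.g. $125
    return profit_per_100 / 100                                 # e.g. $1.25

# The worked example from the text: 20000 visitors, 500 sales at $50 profit.
print(round(max_affordable_cpc(20000, 500, 50.0), 2))  # 1.25
```

Any bid below this figure leaves you a profit on each sale; any bid above it and the clicks cost more than they earn.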

Pay per click search engine guide

What are pay per click search engines?

Pay per click search engines (also called pay-for-performance engines and paid listings, among other names) are engines which allow site owners to determine their site's ranking in that engine's results by bidding on keywords. Usually the first three to five search results are also used by a network of partner search sites. The result for the site owner or webmaster is a lot more highly targeted traffic and a lot more sales.

The underlying idea of pay per click search engines is that you find keywords related to your website, then bid on those keywords to buy high positions on your chosen engine. Your bid amount is the amount you're willing to pay for each visitor who reaches your site through the search results. The more times a word or phrase has been searched for, the higher you'll have to bid to get high rankings. The only limitation is that your site must be at least vaguely relevant to the keyword you want to bid on. These engines allow you to skip all the search engine optimization stuff and simply pay for visitors.

Pay per click search engines have a number of benefits for any webmaster:

You only pay for visitors. Unlike banner advertising, where you pay each time someone sees your ad, pay-per-click engines only charge you when someone actually clicks on your listing. This means you're getting guaranteed visits.

Pay-per-click engines provide highly targeted, cheap visitors. Often you can buy a good ranking on a decent keyword for as little as 1 or 2 cents per click. Overture has of late set $0.05 as the minimum bid, however. Popular search terms can cost much more on the big pay per click search engines, most notably Google Adwords, Overture.com and Findwhat.com. Even still, PPC engines are one of the most cost effective ways of driving targeted people to your site.

Hopefully you'll agree now that using pay-per-click search engines is a great way to increase traffic and profit. Let's now talk about how to use Overture.com, Findwhat.com and other PPC search engines to make the most of your money. A point to note before we continue: Google Adwords, although similar in many ways, has some fundamental differences from Overture, Findwhat and other PPC engines, and has therefore been covered in its own article entitled Google Adwords guide. That's not to say that many of the techniques found within this article can't be applied to Adwords; I'm simply saying refer to the Adwords article for the definitive Adwords guide.

Relevant terms are the ones that'll bring you the highest quality traffic. Basically this means only bidding on terms that are directly related to your site. Imagine, for example, that akamarketing.com decides to use Overture or any other PPC search engine in the future; it would be bidding on terms such as "Internet marketing articles" and "website promotion articles", because they're the main focus of the site. Now imagine you bid on terms that aren't really directly related to your website. The people that come from these terms are not likely to buy or sign up, but you still have to pay for them. It's like giving the PPC search engines free money, so it's vital to always stay relevant.

Search engine cloaking and stealth technology

Tired of the search engine optimization game? Lots of webmasters are. Today the Internet is more a big shop than the information library it became so popular for, which of course means that there are hundreds if not thousands of sites competing for the same customers.

Search engines play a very big part in whether company A or company B gets a visitor and potential customer. Webmasters and Internet marketers know this, and hence competition for search engine traffic is fierce. These days it's almost impossible to keep up with the search engines: one day your site could be near the top, the next day your competition could be there and you could be gone from the results completely. One particular method, however, is being used by webmasters to enable their sites to rank high and stay high. The method is highly controversial and risky. It's called search engine cloaking.

What is search engine cloaking?

Search engine cloaking is a technique used by webmasters to enable them to get an advantage over other websites. It works on the idea that a 'fake' page is delivered to the various search engine spiders and robots while the real page is delivered to real human visitors. In other words browsers such as Internet Explorer, Netscape and Opera are served one page and spiders visiting the same address are served a different page.

The page the spider sees is a bare-bones HTML page optimized for the search engines. It won't look pretty, but it will be configured exactly the way the search engines want in order to rank high. These 'ghost pages' are never actually seen by any real person, except of course the webmasters who created them. When real people visit a site using cloaking, the cloaking technology, which is usually based on Perl/CGI, sends them to the real page, which looks good and is just a regular webpage.

The cloaking technology is able to tell the difference between a human and a spider because it knows the spiders' IP addresses. No two IP addresses are the same, so when a visitor arrives at a site using cloaking, the script compares the visitor's IP address with the IP addresses in its list of known search engine IPs. If there's a match, the script knows a search engine is visiting and sends out the bare-bones HTML page set up for nothing but high rankings.

Once a list of all the search engine spiders' IP addresses has been stored, it's simply a case of writing a script that says something like:
If IP request = google(Spider IP) then show googlepage.html
If IP request = unknown (other user) then show index.html

This means that when the Google spider comes to visit a site, it'll be shown a page that is optimized with keywords, heading tags and optimized content. Since the optimized page is never seen by a casual user, design is not an important issue. When a user comes to the site, the server performs the same check and, finding that the IP address does not match any in its list, shows the standard page.
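The article describes the cloaking script in Perl/CGI terms; purely as an illustration, the same lookup logic can be sketched in Python. The IP addresses and page names below are hypothetical placeholders, not real spider addresses:

```python
# Minimal sketch of IP-based cloaking as described above.
# The spider IPs and page names are hypothetical placeholders.
SPIDER_PAGES = {
    "216.239.46.10": "googlepage.html",   # hypothetical Googlebot IP
    "66.196.90.27": "inktomipage.html",   # hypothetical Slurp IP
}

def page_for_visitor(ip_address):
    """Serve the optimized page to a known spider IP,
    and the regular page to everyone else."""
    return SPIDER_PAGES.get(ip_address, "index.html")

print(page_for_visitor("216.239.46.10"))  # googlepage.html
print(page_for_visitor("62.77.170.1"))    # index.html
```

In a real deployment this check would run server-side on every request, with the IP list kept up to date as the engines change their spiders.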

Search engine cloaking is also a great way of protecting the source code that's enabling you to rank high on the search engines. Ever read a search engine ranking tutorial that recommends you model your keyword density, layout, etc. on pages that already rank high? Well, technically that's stealing, and your competition might want to do it to you some day. With search engine cloaking, however, you can protect your code, because when your competition visits they'll be sent to the regular page and not the page that's earning you those precious rankings.

Different types of search engine cloaking

There are two types of cloaking: the first is called User Agent cloaking, and the second is IP-based cloaking, which we've already discussed above. IP-based cloaking is the better method, as IP addresses are very hard to fake, meaning your competition won't be able to pretend to be one of the search engines in order to steal your code. User Agent cloaking works similarly, except that the cloaking script compares the User-Agent string sent with each page request against its list of search engine User-Agent names and then serves the appropriate page.

The problem with User Agent cloaking is that agent names can easily be faked. Imagine Google introducing a new anti-spam measure to beat cloakers: all it needs to do is fake its agent name and pretend to be a normal person using Internet Explorer or Netscape. The cloaking software will then take Google's bot to the non-optimized page, and your search engine rankings will suffer. User Agent cloaking is much riskier than IP-based cloaking and is not recommended.
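For comparison, a User Agent cloaking check can be sketched the same way; it matches against the request's User-Agent header instead of the IP address, which is exactly why it is easier to fool. The page names are again hypothetical:

```python
# Sketch of User Agent cloaking: match the User-Agent header against
# known spider names. Page names are hypothetical placeholders.
SPIDER_AGENTS = {
    "googlebot": "googlepage.html",
    "scooter": "altavistapage.html",   # Scooter was Altavista's spider
}

def page_for_agent(user_agent):
    # Anyone who fakes this header gets the other page, which is
    # the weakness described above.
    for name, page in SPIDER_AGENTS.items():
        if name in user_agent.lower():
            return page
    return "index.html"

print(page_for_agent("Googlebot/2.1"))          # googlepage.html
print(page_for_agent("Mozilla/4.0 (Windows)"))  # index.html
```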


Search engine cloaking isn't as effective as it used to be. The search engines are becoming increasingly aware of the various cloaking techniques being used by webmasters, and they're gradually introducing more sophisticated technology to combat them. That said, cloaking can still benefit your search engine rankings; it's just a matter of being very careful. I would recommend you read my SEO tutorial entitled Search engine optimization guide and try regular search engine optimization first. If after a few months you're still not seeing good results, then you could at least consider using cloaking technology to improve your rankings.

Open Directory Project guide

The Open Directory Project (ODP), based at dmoz.org, is perhaps the most important directory any webmaster can submit a site to these days. Even more important than Yahoo, you might ask? Well, yes, in my opinion anyway. You see, the ODP, formerly known as NewHoo, not only operates its directory from its headquarters at dmoz.org but also supplies its directory data to such big-player engines as Google, AOL Search, Netscape Search, Lycos and HotBot, to name just a few. In addition to these 'mainstream' engines, hundreds of other sites also use the ODP's data.

This means that a listing in the ODP directory will enable your site to show up for searches done on any engine or site which uses the Open Directory Project's data, provided of course your title and description are optimized for the searched terms. The fact that lots of third-party search engines and sites use ODP data is the main attraction of applying for a listing; the main attraction is not, as one might think, the searches done at dmoz.org itself. Dmoz.org receives only a tiny percentage of the traffic that sites such as Google and Lycos receive.

In particular, a listing in the ODP directory can be very advantageous for sites wishing to rank high in the Google search engine. Google not only uses the Open Directory Project's data for the Google Directory located at directory.google.com, it also 'mixes' the data with its own to determine where sites should rank in the search results. A listing in the ODP will boost Google's view of how important your site is (i.e. boost your PageRank) and hence help increase your ranking for your chosen keywords and key phrases. Presumably you now know that getting listed in the ODP is very important. Getting a listing is not hard; it does sometimes take a little time, but it's not hard, and what's more, it's completely free. This article is your guide to submitting to the Open Directory Project.

A few pre-requirements

Before I continue and tell you how to submit your site, I'm first going to tell you not to bother... IF your site doesn't meet a few pre-requirements, that is. I believe these pre-requirements to be very important to the ODP and almost all other directories. Firstly, your site must be finished. That means no fancy 'under construction' graphics with smiling builders waving their hammers back and forth; ODP editors don't care if you know how to use a free animated GIFs directory, and neither do I for that matter. That means no broken links: how do you expect an editor to review your site if he or she can't first view it? That means fast or at least average page loading times: editors are busy editing their chosen categories and don't have the time or patience to wait an eternity for your site to load.

Secondly, your site should be unique and contain useful content. This means your site should not be a mirror site with the exact same content as another site already listed in the directory, and it should not be simply an affiliate farm designed specifically to promote other companies' products. Thirdly and finally, your site should not be an illegal underground-type site; ODP editors will simply move on to the next submission if they come across one. Remember, the ODP is a directory just like Yahoo and Looksmart, so real people will visit your site. These people are experts in their chosen fields and can spot quality sites when they see them. If your site is of poor quality then I'm sorry, but you're rejected – "Next!!!"

Robots.txt file guide

We all know search engine optimization is a tricky business. Sometimes we rank well on one engine for a particular keyphrase and assume that all search engines will like our pages, and hence that we'll rank well for that keyphrase on a number of engines. Unfortunately this is rarely the case. All the major search engines differ somewhat, so what gets you ranked high on one engine may actually help to lower your ranking on another.

It's for this reason that some people like to optimize pages for each particular search engine. Usually these pages differ only slightly, but that slight difference can make all the difference when it comes to ranking high. However, because search engine spiders crawl through sites indexing every page they can find, they might come across your engine-specific optimized pages and notice that they're very similar. The spiders may then conclude you're spamming and do one of two things: ban your site altogether or severely punish you in the form of lower rankings.

So what can you do to, say, stop Google indexing pages that are meant for Altavista? The solution is really quite simple, and I'm surprised that more webmasters who optimize for each search engine don't use it. It's done using a robots.txt file which resides on your webspace. A robots.txt file is a vital part of any webmaster's defence against getting banned or punished by the search engines when he or she designs different pages for different search engines.

The robots.txt file is just a simple text file as the file extension suggests. It's created using a simple text editor like Notepad or Wordpad, complicated word processors such as Microsoft Word will only corrupt the file.
Here's the code you need to insert into the file:

The parts in brackets are placeholders which you change to suit the engine you want to keep away and the file you want to keep it from; the rest is compulsory and never changes.
User-Agent: (Spider Name)
Disallow: (File Name)

The User-Agent line names the search engine's spider, and Disallow names the file that you don't want that spider to crawl. The field names aren't case sensitive, but to follow the common convention make sure the U and A are in caps, and likewise the D in Disallow.

You have to start a new batch of code for each engine, but if you want to list multiple disallowed files you can place them one under another. For example -
User-Agent: Slurp
Disallow: /internet-marketing-gg.html
Disallow: /internet-marketing-al.html
Disallow: /advertising-secrets-gg.html
Disallow: /advertising-secrets-al.html
(Slurp is Inktomi's spider. Note that each Disallow path begins with a forward slash; a comment like "Inktomi's spider" must not appear on the directive line itself or it would break the rule.)

In the above code I have disallowed Inktomi from spidering two pages optimized for Google (internet-marketing-gg.html & advertising-secrets-gg.html) and two pages optimized for Altavista (internet-marketing-al.html & advertising-secrets-al.html). If Inktomi were allowed to spider these pages as well as the pages made specifically for Inktomi, I would run the risk of being banned or penalized, so it's always a good idea to use a robots.txt file.
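If you want to double-check the policy rather than trust your eyes, Python's standard library can parse the file above locally and tell you what a given spider may fetch:

```python
# Verify a robots.txt policy locally with the standard library.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-Agent: Slurp
Disallow: /internet-marketing-gg.html
Disallow: /advertising-secrets-gg.html
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Slurp is kept away from the Google-optimized pages...
print(parser.can_fetch("Slurp", "/internet-marketing-gg.html"))  # False
# ...but may still fetch everything else.
print(parser.can_fetch("Slurp", "/index.html"))                  # True
```

The same check works against a live site by giving the parser the robots.txt URL and calling read() instead of parse().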

I mentioned earlier that the robots.txt file resides on your webspace, but where on your webspace? The root directory, that's where; if you upload the file to a sub-directory it won't work. If you want to block certain engines from files that do not reside in your root directory, you simply include the right directory in the path and then list the file as normal, for example -
User-Agent: Slurp
Disallow: /folder/internet-marketing-gg.html
Disallow: /folder/internet-marketing-al.html

If you want to disallow all engines from indexing a file, you simply use the * character where the engine's name would usually be, for example -
User-Agent: *
Disallow: /private.html
Beware, however, that the * character won't work on the Disallow line. Here are the names of a few of the big engines' spiders; do a search for 'search engine user agent names' on Google to find more.
Excite - ArchitextSpider
Altavista - Scooter
Lycos - Lycos_Spider_(T-Rex)
Google - Googlebot
Alltheweb - FAST-WebCrawler/

Be sure to check the file over before uploading it, as a simple mistake could mean your pages are indexed by engines you don't want indexing them, or even worse, none of your pages are indexed at all. A little note before I go: I have listed the User-Agent names of a few of the big search engines, but in reality it's not worth creating different pages for more than six or seven engines. It's very time consuming, and the results would be similar to creating different pages for only the top five; more is not always best. Now you know how to make a robots.txt file to stop you getting banned by the search engines. Wasn't that easy? Till next time.

All links are not created equal

by: Terry Mickelson

In the past year several search engines have announced that links are part of the criteria they use to rank sites, and this has caused a great deal of confusion. There are two types of links: inbound and outbound. Within your site it is important to link pages together; this creates an inbound link from page to page and an outbound link from page to page. Pages that do not have even one link pointing to them are called orphan pages, and some search engines, like Google, will not index orphan pages.

When a search engine crawls a page it looks for and follows each link, indexing the location and the topic or theme of the page. This explains why search engines sometimes have pages indexed that were never submitted to them: they just find them on their own. Not all links can be followed by all of the search engines, however. For example, many sites use JavaScript to create "rollovers" - images that change when a visitor's mouse is placed over them - as links from page to page. Typically, search engines cannot follow these links. The same is true when an image map is used for links or site navigation. Unless the search engines can follow the links on your pages, they will never find the deep pages of your site.

When creating a linking plan for your site it is important to understand that not all pages are created equal. For example, if your site is a comparison-shopping site it will have many product lines, with each category of products grouped together. In one section, camping for example, you may find tents, sleeping bags and coolers. Another section may be housewares, where you will find linens, pots and pans, and knives. It is important that the pages within each section be linked together; this creates a theme within each section of the site. Each link should use a word that describes the theme of the page it is linking to. This theming allows the search engines to group sections of your site together and catalog them as a whole, making the housewares section more relevant than the camping section when a search is done for linens. The only thing the sections have in common is that they have products for sale.

Inbound links from external web sites are very important to your search engine success. Some people think that the more links they have to their site the better - that more inbound links means a more popular site. However, the search engines are smarter than this and understand that more links does not mean more popular; it just means more links. Because of this, several of the top engines actually penalize sites for belonging to "link exchanges" - places where sites simply swap links. In most cases all the member sites have in common is that they link to one another.

Search engines look for inbound links from popular, relevant sites with the same theme, just as they look for relevant links with the same "theme" within the site. A perfect theme will have the main keyword or phrase on your site match the main keyword or phrase on the sites that link to you. You can check the theme of the sites that link to you by going to http://www.searchengineworld.com/cgi-bin/theme.cgi and running a report. This report shows what the major keywords are on the sites that link to you.
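A theme report of this kind boils down to counting word frequency on the linking pages. As a rough, illustrative sketch of that idea (the stop-word list and sample text are made up for the example):

```python
# Rough sketch of a "theme" report: count the most frequent words on a
# page, ignoring common stop words. Stop words and sample text are
# illustrative only.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "is", "for", "on", "in"}

def top_keywords(text, n=3):
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return [word for word, _ in counts.most_common(n)]

sample = "Internet marketing articles and internet marketing tips for marketing online."
print(top_keywords(sample))  # ['marketing', 'internet', ...]
```

Running this over the pages that link to you would show whether their dominant keywords match the main keyword or phrase on your own site.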

It is important to get your site listed in the proper section of the Open Directory and Yahoo. Search engines know that both of these directories use human editors to approve sites, so a listing there signals that a site contains valuable content. For this reason it is imperative to get listed in Yahoo and the Open Directory before you submit your site to the major search engines. There are only three reasons anyone will ever link to your site: they have seen it and really liked what they saw, you agree to pay them, or you agree to give a link back.

The more popular your site becomes, the more people will link to it without any prompting from you. Buying links can be very expensive, but in the case of some portals or industry trade sites it may be worth purchasing a link. The most cost-effective way to get inbound links is to find sites with the same theme as yours and ask to trade links. Many site owners are afraid that if they put a link on their home page to a page of links that includes sites with related, relevant information - even competitors - they are inviting people to leave their site. If the people who come to your site are more compelled to click on "links" than to go deeper into the site to see your information, the site has bigger problems than not being found in the search engines. There are several steps that can be taken to minimize this.

1. Create a page of links organized by category. If your company writes business plans, have a link category for venture capital, start-up consultants, etc.
2. Below the link categories, add copy explaining how to get a link added to your site, including the description you want to appear on their site.
3. Make the link to the page of links small.
4. When a link is clicked, make it open a new window so that your site remains open and visitors can return to it.

It is very easy to check how many inbound links your site has. Go to google.com, enter link:anydomain.com and click search. You will see a list of all the pages Google knows about that link to your site. Next go to www.altavista.com and conduct the same search. Chances are very good that the results will not match the Google results: Google and AltaVista are distinct companies and do not index the same sites. Sometimes your competitors are the best source of links. By doing the above searches on your competitors' URLs you will find sites that link to them, which is a good pool to start asking for links from.

There is no perfect number of inbound links. Five or ten good-quality links will do more for your site than hundreds of links from sites that do not share your theme. Once you have a link from a site, make sure you submit the page it is on to the major search engines. This ensures that all of the engines are able to find, and give you credit for, the inbound links.

There are several software packages that attempt to automate the process. In all of them you enter the keyword or phrase that defines your theme and the software finds sites that are being found under that same search term. The software gathers all of the sites and creates a page with links to all of them and in some cases a description of the site. You then write an email to the site owner (or whatever email address is listed) advising them you have placed a link from your site to theirs and that you would like a reciprocal link. For a free copy of Link Crafter linking software go to pageviews.com and navigate to the tool section.

Finding sites with the same theme as yours, asking for links, following up the requests, verifying the links have been added and in some cases teaching others how to add a link back to your site is very time consuming. Most sites will never put the effort required into it. You can gain a huge competitive advantage in the search engines if you take the time or put in the effort to get inbound relevant links.
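As an illustration of the verification step described above, here is a small sketch (not any particular product's code) that parses a partner page's HTML with Python's standard library and checks for a link back to your domain. The page content is inlined here; in practice it would be fetched over HTTP:

```python
# Verify a reciprocal link: does this page contain an anchor pointing
# at our domain? HTML is supplied inline for illustration.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect all href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def links_back_to(page_html, domain):
    """True if any anchor on the page points at the given domain."""
    collector = LinkCollector()
    collector.feed(page_html)
    return any(domain in href for href in collector.links)

partner_page = '<p>Resources: <a href="http://www.akamarketing.com/">marketing articles</a></p>'
print(links_back_to(partner_page, "akamarketing.com"))  # True
```

Run over a list of partner URLs, a check like this turns the tedious follow-up work into a quick automated pass.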

Inbound vs outbound links

Link Popularity Explained and How To Build Links - Inbound links...
Inbound links:
Inbound links to a web site are links that originate from an outside web site. An example would be a link on SiteB pointing into a page on SiteA.
SiteA <----- SiteB

Outbound links:
Outbound links from a web site are links pointing to a page on an outside web site. An example would be a link on SiteA pointing out to a page on SiteB.
SiteA -----> SiteB

There are two further classifications of links:
One-way links:
A one-way link is a link that is not returned; for example, SiteB links to SiteA but SiteA does not link back to SiteB.
Reciprocal links:
An example of a reciprocal link is when SiteA links to SiteB AND SiteB links to SiteA; the link is reciprocated by both parties.
SiteA -----> SiteB
SiteA <----- SiteB

To achieve high link popularity, the type of links to build are one-way inbound links. This simulates how natural links are created, i.e. links that people create to point to your site because they found it worth linking to. As for outbound links, it is natural to assume they might decrease a site's link popularity, but this is not true: you do not give away your link popularity when you link to another site. They do not add to your link popularity either, though.
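The classifications above can be made concrete with a toy link graph. Given who links to whom, this sketch separates a site's inbound links into one-way and reciprocal (the site names are of course placeholders):

```python
# Toy link graph: each site maps to the set of sites it links out to.
links = {
    "SiteA": {"SiteB"},            # SiteA links out to SiteB
    "SiteB": {"SiteA", "SiteC"},   # SiteB links to SiteA and SiteC
    "SiteD": {"SiteA"},            # SiteD links to SiteA
}

def classify_inbound(site):
    """Split a site's inbound links into one-way and reciprocal sets."""
    inbound = {src for src, outs in links.items() if site in outs}
    reciprocal = {src for src in inbound if src in links.get(site, set())}
    one_way = inbound - reciprocal
    return one_way, reciprocal

one_way, reciprocal = classify_inbound("SiteA")
print(sorted(one_way))     # ['SiteD']
print(sorted(reciprocal))  # ['SiteB']
```

Here SiteD gives SiteA a one-way inbound link (the kind the article recommends building), while the SiteA/SiteB pair is reciprocal.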

