Beginner’s Guide To The Basics of Search Engine Optimization


This Beginner's Guide to the Basics of Search Engine Optimization will thoroughly cover the fundamentals of SEO. Whether you are looking to get better rankings for your own website or blog, or you simply want to earn income online, you'll need a solid SEO strategy.

How Do Search Engine Spiders Work – Basics of Search Engine Optimization

Search engine spiders are by far one of the most useful things to come around in the last 25 years of the internet. They are useful not only to the search engines (Google and many others) that use them, but also to people who are searching for a particular site and to those who run websites. Spiders allow your site to be seen by the millions of people who use search engines every day. In this newsletter, we will discuss what search engine spiders do, how they work, and how to set up a robots.txt file and upload it to your site to control whether spiders visit your site.

What are spiders and what purpose do they serve?

Spiders are essentially programs that “crawl” sites and report their findings back to their superior (Google or whatever search engine they were created for). Their purpose is to make it possible for your site to be found and ranked in search engines.

You might be wondering, what does it mean to “crawl” a site?

Well, it means to visit a site and copy its information.

How do spiders work?

Spiders work by finding links to websites, visiting those websites, going through their content, and then reporting that content back to the database of the search engine they work for. Google spiders, thus, crawl sites and report the information back to Google’s database. From there, the information is added to Google’s search engine, and the site then shows up in Google search results. Much the same process happens with any other search engine spider.

How can I keep spiders from visiting my site?

You might be thinking, “why would I want to keep such a useful thing from visiting my site?” Well, the short answer is, sometimes site owners don’t want the spider to crawl a particular part of their site. Some site owners don’t want spiders to crawl their site at all. The reasons vary, although most of the time it is because the site either is completely spam or features a page or two of spam.

If you’re one of those site owners, then you’ll want to create and upload something called a robots.txt file. We will briefly go over how to do this.

A robots.txt file

The whole purpose of a robots.txt file is to tell a search engine spider not to crawl the site or part of the site on which the robots.txt file resides.

Creating the file

Creating a robots.txt file that blocks out spiders is easy.

First, open up a notepad. Then, copy and paste the following:

User-agent: *

Disallow: /

Once you’ve done that, save the file as “robots.txt” (a plain text file).

Uploading the file

Next, upload the robots.txt file to the root folder of your site (the same folder as your index file), because search engines only look for it there. To block just one part of your site, such as a news folder, change the Disallow line to name that folder (Disallow: /news/) instead of the whole site (/). That’s all there is to it.
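You can check what a robots.txt file actually blocks before uploading it, using Python's standard-library robots.txt parser. This is a sketch; the /news/ folder is just an illustrative path:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks every spider from the /news/ folder only
# (/news/ is an illustrative path; substitute your own folder).
robots_txt = """\
User-agent: *
Disallow: /news/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Compliant spiders may not fetch pages under /news/ ...
print(parser.can_fetch("*", "/news/story.html"))  # False
# ... but the rest of the site stays crawlable.
print(parser.can_fetch("*", "/index.html"))       # True
```

Well-behaved spiders run the same check before crawling each URL, which is why a single file at the root of your site controls the whole thing.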

Using the robots.txt file to make sure search engine spiders DO visit your site

Believe it or not, the robots.txt file can be used to both disallow and allow search engine spiders to crawl your site.  Here’s how to create and upload such a file.

Creating the file

Open up notepad and copy and paste in the following:

User-agent: *

Disallow:

You’ll notice that the only difference between this and the earlier example is that Disallow: is not followed by a /. If it were, that would tell spiders to go away. Once again, save the file as robots.txt.

Uploading the file

All you’ll do is upload the robots.txt file to the root of your site, right alongside the index file. With nothing after the Disallow line, spiders are free to crawl the whole site.

 Creating and uploading a robots.txt file to help make sure spiders don’t miss your site is fast and easy. So what are you waiting for? Create and upload that file now!

Quick Site Indexing – Basics of Search Engine Optimization

Getting indexed quickly in search engines is something every website owner wants. Lots of site owners don’t really know how to accomplish it, though. Fortunately, there are several ways to get pages indexed quickly, all of which we will go over in this article. These include submitting your URL, using keywords, using links, and using blogs.

Tip #1: Submit your site map manually to search engines

Search engines like Google, Bing, and Yahoo all have options for you to manually submit your sitemap. Before submitting, however, it is important to make sure of the following:

1.   All pages on your site are complete. Search engine spiders won’t touch incomplete sites.

2.   Your site is not full of spam and/or excessive use of keywords. Yet again, spiders won’t touch spam sites, which means your site will be skipped over if it contains lots of spam.

When you submit the sitemap, make sure to submit it only once to each search engine. Submitting multiple times to the same search engine will get you nowhere.
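A sitemap is just a small XML file listing your pages. As a rough sketch (the example.com URLs are placeholders for your own pages), Python's standard library can generate a minimal one:

```python
import xml.etree.ElementTree as ET

def build_sitemap(page_urls):
    """Return a minimal sitemap XML string listing the given pages."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in page_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# example.com is a placeholder domain; list your own site's pages.
sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/about.html",
])
print(sitemap)
```

Save the output as sitemap.xml at the root of your site, and that is the file you submit to the search engines.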

Tip #2: Use keywords to get spiders to come to your site

Search engine spiders absolutely love sites with good, well-placed and well-used keywords and phrases. In other newsletters, we have covered the correct use of keywords and key phrases, so you might already be familiar with it. If not, we’ll briefly touch on it right now.

Keywords are the words search engines look for: words used frequently on a site. Correct usage of keywords means making them fit naturally throughout a piece of content so they do not look like spam. Before writing your content for your site, take a second to decide what the content is about. Then make a list of keywords or key phrases that fit the topic at hand.

Next, write the content and try to sprinkle in the keywords or keyphrases throughout the content in a natural, non-distracting way. When you finish writing the content, count the number of keywords or key phrases.

Then divide that by the total number of words in the content. If the number you get is between 1 and 3%, you’re good to go. If it’s below 1%, try to naturally add in keywords or key phrases; if it’s above 3%, try to take some out. Aim for the perfect balance of between 1 and 3%.
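The count-and-divide arithmetic above is easy to automate. Here is a minimal sketch that handles single-word keywords only (key phrases would need a slightly different match):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` equal to `keyword` (case-insensitive)."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    hits = sum(1 for word in words if word == keyword.lower())
    return 100 * hits / len(words)

sample = ("Digital photography is booming. Good photography gear helps, "
          "but practice matters more than gear in photography.")
print(keyword_density(sample, "photography"))
# prints 18.75 (far too dense; real content should land between 1 and 3%)
```

Run it over a full draft rather than a two-sentence snippet; at realistic lengths the percentage settles into a meaningful number.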

Tip #3: Use relevant internal links and External Links

Having another site link to yours means the spiders that visit that site will follow the link to your site. Many owners of sites related to yours are happy to post a link to your site on their links page. Some might require what we refer to as “reciprocal linking”. This basically means that if they place a link to your site on their page, they want you to return the favor by posting a link to their page on your site. Doing so helps both sites better gain the attention of search engine spiders and helps them index quickly.

Tip #4: Install a blog on your site

Blogs are extremely popular and quickly listed by search engines. Places like Google typically index (or list) a blog far faster than a regular website. Blogs are often indexed by search engines like Google within days of publishing, whereas it can take weeks or months for search engines to notice a regular site.

Installing a blog on your site is really easy. All you have to do is install a program such as WordPress on your website and then use that to make blog postings. If you run a sports site, for instance, you could try making twice-a-week posts on the blog about sports news. Doing so will help your site as a whole get noticed quicker by search engines.

If you apply just one of the tips above to your website, you will get listed quicker than expected.

Creating a blog on your website is quicker than ever before. Do it today and watch how quickly and effortlessly your site gets listed on search engines!

Submissions & Resubmissions – Basics of Search Engine Optimization

Sometimes, search engine spiders don’t visit your site even after it has been up for a month or two. Or sometimes a page gets indexed on a search engine but is then taken down. In either case, it is time for the website owner to submit or re-submit the site to a search engine. This process isn’t overly difficult, but there are some guidelines website owners should follow in order to ensure success. We’ll be going over where you can submit sites, how to submit sites, and how to re-submit sites.

Where can I submit my site to?

There are literally hundreds of different search engines to which you can submit your site. We’ll go over the biggest one, Google, here.


Google is definitely the most popular search engine around, as millions of people use it each day. You won’t usually have to manually submit sites to Google, as its spiders are pretty good about finding new sites. However, you occasionally will have to do this. It’s pretty easy, and we’ll go over the process later in this newsletter.

One thing to keep in mind is that you should submit your site not just to the well-known search engines, but also to the smaller ones as well. The more pages search engines index, the more visitors you are likely to get.

How to Submit Webpages to Google

Follow these instructions to submit specific pages (URLs) to Google for index consideration.
Keep in mind that you are able to submit up to 10 page URLs per day through Search Console.

1.   Log in to Google Search Console and select the desired website property. (Here’s how you can set up Google Search Console for your website.)

2. From the menu, choose URL inspection.

3. Type the URL of the webpage you want Google to crawl (must be in the selected site). Press Enter.

4. The URL Inspection report displays the latest crawl data for this page. Next, click Request Indexing.

5. After Google tests the URL to make sure it exists, you will see the message “Indexing requested.” Click Got it to close the dialog box.


 How to Submit a Site to Bing (and Yahoo)

You can submit up to 10,000 webpage URLs per day to Bing. When you submit your site to Bing, you also submit the website to Yahoo automatically since Bing feeds Yahoo’s web search index.

Here’s how to submit a site to Bing …

1. Sign in to Bing Webmaster Tools. (We recommend setting up a Bing Webmaster Tools account. Fortunately, it’s easy to do. If you already have Google Search Console, Bing can pull your verification info from there.)

2. Select the appropriate website if you have more than one.

3. From the menu, choose Configure My Site > Submit URLs.

4. Type or paste in the URLs you want to submit for indexing from the website, one per line.

5. Click Submit.


How to Re-submit Website Pages

Before you try to re-submit a page, take the following into consideration.

1.   Has it been at least a few weeks since you originally submitted the URL?

2.   Was your webpage removed from the search engine because its content was inappropriate? If so, you should reconsider re-submitting your URL, as it will only get taken down again if it still has inappropriate content on it.

If it has been several weeks since you submitted your web pages, or you believe your site was pulled from the listings accidentally, then you should re-submit the site. To re-submit the URL, all you have to do is follow the same process as when you first submitted the site.

Search engine submission is fast and easy. If your site isn’t listed on the search engines, take a minute to manually submit it. It’ll help boost your traffic!

Search Engine or Directory? – Basics of Search Engine Optimization

Getting your site listed in search engines and directories is an important part of building a successful site. Most people tend to think of search directories and engines as one and the same. However, they are actually quite different. The only similarity between the two, in fact, is that they serve the same purpose: to help people find websites of interest to them. In this newsletter, we will talk about the various search engines and directories, what their differences are, and why each is essential to delivering traffic to your site.

Search Engines:

By far the most popular search engine around is Google. Other major search engines include Bing and Yahoo.


Directories:

Probably the most popular example of a directory is Yahoo. Another well-known directory is the Open Directory Project.

The Differences between Search Engines and Directories – Basics of Search Engine Optimization

Difference #1: Search engines are run by robots; directories are run by human editors

The most notable difference between search engines and directories is that search engines do not use human editors like directories do. Instead, search engine updates depend on algorithms and robots—spiders and crawlers. What are spiders and crawlers? They are programs search engines use to locate pages, browse them, and then report whatever they find back to the search engine’s database. What is found is then listed on the search engine.

Directories, on the other hand, are completely updated by human editors. What usually happens is that a person submits their site to the directory and then the editor will visit the site and see whether or not it is worthy of being listed in the directory. If it is, it will be grouped with other relevant sites in the directory. If not, it won’t be listed at all.

For this reason, directories are typically harder to get listed in than search engines: human editors won’t allow in unworthy sites, while search engine bots will. Editors also keep directories updated frequently.

Difference #2: Search engines are free to get listed in; some directories are not

Many directories charge money for each submission. Search engines never charge a dime because their bots are solely responsible for listing websites. There are, however, free directories to which you can submit your site.

Difference #3: Search engines are more popular and more used

People want convenience. Search engines offer this. All people have to do is type a keyword into the search engine and relevant sites come up. Those who use directories have to go through the hassle of wading through categories and sub-categories. This takes time and patience, both things that many busy people just don’t have.

Difference #4: Directories have listings grouped together by topic; search engines can sometimes have organizational issues

When someone visits a directory, all they have to do is choose a category that interests them. Then, all the sites relevant to that category will come up on the page. While search engines are usually effective, unrelated listings can sometimes come up in a search for a particular thing. There’s nothing worse than doing a search for cars and having an unrelated listing for cats come up. Unfortunately, this occasionally happens with search engines because algorithms and robots control the updates, not humans.

Why are Search engines and Directories important to your site? – Basics of Search Engine Optimization

There are a few reasons why both are important to your site’s traffic. The most obvious reason is that the more places you are listed, the more likely it is people will visit your site. If people see your site in both search engines and directories, they are more likely to visit it than if they just saw it in a search engine.

Even though directories aren’t as popular now, they are still well-used by people because they neatly group together sites in a particular category. So if you run an online web design business and are listed in the web design category of the directory, everyone who looks for web design in that directory will see your site.

Search engines are also important to your site because they will show your site if a particular keyword(s) is typed into the engine. If your site is listed in both directories and search engines, you will have more traffic than if your site were listed in one or the other.

Be sure to do whatever it takes to ensure that your site gets listed in both and you will reap the benefits.

 Search engine bots and easy directory submission make it quick and painless for your site to get maximum exposure from both of these wonderful tools. So go ahead and take the steps necessary to get listed!

Page Length For Search Engines – Basics of Search Engine Optimization

You’ve decided to create content for your web pages, but aren’t sure how many words each page should have to ensure that search engines and visitors come back often. It’s a dilemma that many website owners face. They don’t want too few words, but they don’t want too many, either. It’s a delicate balance that every website owner has to strike in order to have a site that is search engine optimized and visitor friendly. In this newsletter, we’ll cover the appropriate length for pages and the reasons your pages should adhere to it.

What is a good length for pages?

Ideally, each page should have at least 700 words, but no more than 1500 words. Anything less than 700 words and search engine spiders might just skip over your site entirely; anything more than 1500 and spiders won’t take the time to search through the content for keywords.
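A quick way to check a draft against this band is to count its whitespace-separated words. A minimal sketch, with the 700/1500 limits taken from the guideline above:

```python
def page_length_ok(text, low=700, high=1500):
    """True when the page's word count falls inside the recommended band."""
    word_count = len(text.split())
    return low <= word_count <= high

draft = "word " * 900           # a stand-in for roughly 900 words of real content
print(page_length_ok(draft))    # True: 900 words is inside the 700-1500 band
print(page_length_ok("too short"))
```

Word processors and CMS editors show the same count, but a check like this is handy when validating many pages at once.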

When I write the content, what should I strive to do?

You should strive to write content that is entertaining and easy to read. Don’t drone on and on just to meet the 700-word requirement. Find things to talk about that are relevant to your site and come up with ways to say them in a concise manner.  Remember that search engine bots aren’t the only thing to come to your site—people are also coming to your site and they will want to read content that makes them want to visit again. Aim to please them first, then the search engine bots.

How do I please search engine bots? – Basics of Search Engine Optimization

Pleasing search engine bots is as simple as having 700+ words of content and some keywords sprinkled into your content. Keywords are words that the search engine bots immediately spot and use to figure out how to rank your site. In a document of 700 words, you’ll want to have at least 7 keywords, but preferably around 21, throughout your content. When using keywords, make sure to sprinkle them naturally throughout the document so that you do not distract your reader from what the content is about.

Why should I have pages with content between 700 and 1500 words?

The reasons for this are simple. Having pages with content of 700-1500 words will please visitors and search engine spiders.  Pleasing both will help your site attract more traffic and become successful.

Visitors who come to your site want to see useful content. If your content is short, chances are, it’s not very useful to visitors. Good content is content that is worthwhile to read and tells the visitor something they don’t already know. It must be packed with information and details. Few can accomplish this in less than 700 words, which is why it is often necessary to use that many words.

However, it is also important not to be too wordy with your content. People want to be able to quickly get the content they need. They don’t want to read through hundreds of words of fluff in order to get what they came for. So be concise and try to make sure your content doesn’t go over 1500 words.

Spiders that come to your site want to have a good amount of content to “chew” on. That is, they want enough content to report back to the search engine database. Less than 700 words really isn’t enough content for a spider to bring back, so spiders tend to give less weight to pages with under 700 words.

Similarities Between Spiders and Humans

Spiders, much like human visitors, also don’t like pages with excessive amounts of words. Pages with more than 1500 words are usually a big turnoff to spiders, and even if a spider does visit the page, chances are it won’t report the contents back to the search engine database it works for. That is why you must strive to be concise not just for human visitors, but for the robot (spider) visitors as well.

If you are able to limit your pages to between 700 and 1500 words, you will do well with human visitors and search engine spiders. That’s the goal of every website owner and should be your goal as well.

Writing content between 700-1500 words is easy as long as you come up with good content to write about and try to be as concise as possible. It’s just that easy!

Content Search Engines Love – Basics of Search Engine Optimization

One of the most important steps to take to ensure that your site is successful is to have a site that is well-loved by search engines. This is actually fairly easy to do—but also fairly easy not to do. It is a known fact that well-placed keywords and keyword phrases on a piece of content will attract search engine spiders. It is also a well-known fact that keyword stuffing—using keywords way too much—will repel spiders and stop them from putting your site on a search engine.

So what is the delicate balance of having just the right amount of keywords and keyword phrases? How do you achieve it? How do you make sure you aren’t overusing keywords? In this newsletter, we will go over all of the above to help make your site keyword optimized.

How do I write content that the search engines love? – Basics of Search Engine Optimization

The easy answer is to make sure your content has several keywords. But how is this done in a natural way so as not to hurt your traffic from actual human beings? Well, it all comes down to working keywords into your content naturally. We’ll briefly go over this process.

#1: Sit down and decide what your content will be about

If your site is all about sports, figure out what kinds of sports content you’ll put on there. Maybe you’ll write about basketball or golf or football or any other sport. Or if you’re running a music site, you could write content about a particular style of music or band/artist. The point is, make sure that whatever content you write is relevant to your site.

#2: Make a list of keywords that are relevant to your content

So if you’re writing a music article, these keywords could be:

Rock

Pop

Hip Hop


But try to narrow your list down to 2 or 3 keywords, and use additional long-tail keywords to increase your chance of ranking. The first keyword should be a primary keyword (used at least 10 times in a 500-word document) and the other one or two should be secondary keywords (used a few times each in a 500-word document).
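Counting how often each keyword actually appears is straightforward with Python's collections.Counter. A sketch for single-word keywords (the music terms are illustrative):

```python
from collections import Counter
import re

def keyword_counts(text, keywords):
    """Map each (single-word) keyword to its occurrence count in `text`."""
    words = Counter(re.findall(r"[a-z']+", text.lower()))
    return {keyword: words[keyword.lower()] for keyword in keywords}

article = "Rock fans love live rock shows. Pop and rock often share a stage with pop acts."
print(keyword_counts(article, ["rock", "pop", "jazz"]))
# {'rock': 3, 'pop': 2, 'jazz': 0}
```

A primary keyword should show the highest count; a zero, as for "jazz" here, tells you a chosen keyword never made it into the draft.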

#3: Begin writing the content

As you write each sentence, try to think about where the primary and secondary keywords might fit in. Wherever it seems natural, use the keywords. However, you should never try to make them fit where they don’t fit. If it looks unnatural, don’t use them.

#4: Read over the content

When you read over the content, try to read it as a visitor would. Do the keywords you’ve tried to incorporate in the text distract you from the meaning of the content? Do the keywords seem blatant?  If they do, rewrite the content to make it flow more naturally.

#5: Count the number of keywords and plain words

If you have a 500-word piece of content, you’ll want to see around 5-15 primary keywords sprinkled throughout the content—a keyword density of at least 1%, but ideally 3%. Keyword density is the number of keywords divided by the total number of words.  A keyword density of 1% in a 500-word piece of content would be 5 keywords, while a keyword density of 3% in a 500-word piece of content would be 15 keywords. Strive for 3%.

#6: Rewrite your content as necessary to have enough keyword density.

If you don’t have enough keyword density or have too much, rewrite the content.

Why is keyword “stuffing” bad for my site? – Basics of Search Engine Optimization

Keyword stuffing makes your site look like spam to search engine spiders, and that is why it is a bad thing for your site: it will actually keep spiders from visiting. Make sure your keyword density stays well below 5%; even 5% is considered too high.

It’s oftentimes a difficult balance for website owners. How do you please both visitors and search engines? Well, it’s not easy, but if you follow certain steps, it isn’t that difficult, either.

So which is more important, to please the visitors or the search engine spiders? The unequivocal answer is to please the visitors. What good is a site that attracts spiders but not actual people? And what good is a site that only attracts some visitors but not search engine spiders? In this newsletter, we will go over writing content that interests and pleases both your readers and the spiders.

So how do I write content that pleases a visitor?

First, stick to writing content that is relevant to your site.  That means that if your site is about Rock music, you should not have any content about dogs, as that only makes your site look bad and repels visitors.

Second, write content in an easy-to-understand, conversational format. Do not use big, fancy words just for the sake of looking smart or pleasing search engine bots.  I can’t count the number of times I’ve visited a site with content that is so hard to comprehend that I do not wish to ever come back to that site again. You want to make a good first impression on anyone who takes the time to look at your site, so make sure your content is easy to understand.

Third, never ever write content that is long, dull, and boring.  If the point you are trying to get across can be said more concisely in 500 words, then why waste another 300 words droning on and on about the topic? This is a huge turnoff to potential visitors.

Fourth, make sure that all of your content is grammatically correct. I know, this is hard because we live in the instant messenger world, where sentences like “how r u?” are thought to be acceptable. However, anyone who is well-educated will appreciate good grammar. Make your site shine in this department.

Fifth, don’t overuse keywords and keyword phrases. In other words, don’t make it blatantly obvious to the reader that you are trying to attract search engine spiders to your site. Make an effort to make sure that your keywords and keyword phrases flow into the content of the article. This is easier said than done but can be accomplished with a little fine-tuning.

But what about search engine spiders? How do I please them? – Basics of Search Engine Optimization

The only way you can possibly displease a search engine spider is by overusing a keyword/keyword phrase and making your site smell like spam. Search engine spiders are now more advanced than ever, and so they are better able to ignore sites that are full of spam. Too many keywords or keyword phrases that are blatantly there will hinder your site from being crawled by spiders.

As is mentioned in another of our articles, a keyword density of 1-3% is generally considered to be good. Any less than 1% is bad and will make it harder for your site to get listed on search engines; any more than 4% makes your site look like spam.  If you haven’t checked out our other newsletter, keyword density is basically the number of keywords or keyword phrases in a piece of content divided by the total number of words.

Before you write your article, take some time to make up a shortlist of keywords that are relevant to the topic at hand. Then try to naturally sprinkle them into your content, so that your content will please both the search engine bots and your readers. If you are able to do that, you will have a successful site in no time. Not only will the search bots love you, but actual people will, too!

 Writing content that is good for both people and search engines is absolutely essential to making your site a powerhouse. So follow the rules above and you will be able to write excellent, pleasing content!

Meta Tags, Titles & Their Uses – Basics of Search Engine Optimization

Title and meta tags can be quite helpful, and effective, in gaining more visitors to your site. However, they are only as effective as you make them. Proper use of meta and title tags is absolutely essential to making them a good thing for your site. In this newsletter, we will go over the basics of title and meta tags: using proper keywords, writing good meta tag descriptions, and using the title tag effectively.

The basics to creating good meta tags

It is best to plot out how you’ll use meta tags on a particular page before even writing the content for the page. Lots of people will write the content of the page and then try to sprinkle in keywords in the meta tags. This is an ineffective way of doing things, as it usually makes your page confusing to visitors.

Before writing content, take a minute to pick out 3-4 primary keywords. Then take another minute to write phrases from those words. Once you’ve done that, you can begin writing your content around the keywords.

How do I select proper keywords to enter into the meta tags?

This is an excellent question. Really, there are two main rules of thumb for selecting proper keywords.

#1: Base your keywords on the content of your site.

For instance, if you are running a digital photography site, your keywords could be digital, photographs, photography, etc. Steer clear of silly keywords that have nothing to do with your site.

#2: Make sure your keywords can flow naturally throughout the content of your site.

Try to pick words that you can easily incorporate into the content of your site and that make sense. You must remember that actual people are reading your site; the content (and keywords) MUST make sense not only to search engines, but also to the people who visit your site. Thus, the keywords you choose must fit the content of your site perfectly.

How do I write a good description tag?

What description tags essentially do is briefly inform the search bot/web crawler what a site is about. Therefore, to write a good description tag, you must be able to write a good, brief description of what your site is about. It is important to note that you are limited to 200 characters or less in the description tag, so be as concise as possible.

Let’s use the photography example again. Your site is based on digital photography. So your description tag should say that your site is related to digital photography.

Example description tag: “Digital photography site where you can buy and print digital photos online.”

The search engine bot/web crawler will see this and will place the site in relevant search results. So if someone searches for digital photos, they should see your site somewhere on the list.

The same principle can be applied to any and every site. Good description tags are as simple as a good, concise description of what your site is about.

How to use the Title tags Effectively – Basics of Search Engine Optimization

There are some misconceptions about title tags. People have said in the past (and some still do today) that a good, effective title tag consists of keywords. However, this is just not true. Every good title tag consists of two things: the name of the site and a brief description of what is on the site.

One thing to keep in mind is that the Title tag is also the title of your listing in any search engine. Why would you want your title listing to consist of keywords? It won’t help your site at all—it will actually hurt it because people are less likely to take the site seriously due to silly keywords. Another thing worth noting is that the Title tag should never be more than 80 characters. Generally speaking, the briefer the Title tag is, the better. So strive to be concise!

So what is a good, effective Title tag?

Well, a good, effective Title tag should first have the name of your website (so if it’s a digital photography site, it could be “Digital Photography”) and then a short description of the site. In this example, you could put “Digital Photography – Buy and Print Digital Photos Here”. This is both an accurate and effective Title tag for this particular site. Apply this principle to your site and you will have a good and effective Title tag.
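The “site name + short description” pattern can be sketched the same way. This is an illustrative Python helper (the names are invented for this example), using the 80-character ceiling mentioned above:

```python
# Sketch: build a title tag from a site name plus a short description,
# enforcing the 80-character guideline from the section above.
TITLE_LIMIT = 80

def make_title_tag(site_name: str, description: str) -> str:
    """Return an HTML title tag in the 'Name - Description' pattern."""
    title = f"{site_name} - {description}"
    if len(title) > TITLE_LIMIT:
        raise ValueError(
            f"Title is {len(title)} characters; stay under {TITLE_LIMIT}."
        )
    return f"<title>{title}</title>"

print(make_title_tag("Digital Photography", "Buy and Print Digital Photos Here"))
# <title>Digital Photography - Buy and Print Digital Photos Here</title>
```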

Meta tags and titles are incredibly useful to your site. Master the above techniques and you will be able to create effective meta tags and titles that will help your site earn a higher search engine ranking and more visitors!

Don’t Get De-Indexed – Basics of Search Engine Optimization

As search engine optimization (SEO) has grown more popular, so has the use of unethical SEO techniques. But what are these “black hat” techniques? Why are they so bad?

Better yet, what can you, as a website owner, do to legitimately increase your search engine ranking? These are all questions which this newsletter will answer. After you’ve read this, you’ll know what techniques are underhanded and which are acceptable.

What are “Black Hat” techniques? – Basics of Search Engine Optimization

“Black Hat” techniques are unethical techniques that some website owners use to get their site listed on search engines. They usually use these techniques to get a high search engine listing. Here is a list of three common “Black Hat” techniques you should avoid if you want to please search engine spiders.

1.   Keyword stuffing. Keyword stuffing is the overuse of keywords in a piece of content. Generally, this is repeating the same keywords over and over just to achieve a higher search engine ranking.

2.   Invisible text. Quite a few cheap sites use this tactic, which involves making keywords in a font that is the same color as the background, so that readers can’t see the massive amounts of keywords, but search engine spiders can.

3.   Doorway pages. These are pages that regular visitors cannot see, but search engine spiders can. They exist to trick search engines into giving the site a higher ranking.

Why are “Black Hat” techniques bad?

They’re bad because they go against the rules set forth by search engines. Not only do they go against the rules, but they also hurt the visitor’s experience. Who wants to see a site stuffed with keywords? “Black Hat” techniques are unethical and wrong.

Do “Black Hat” techniques work?

Yes, they do, which is why people use them. But these techniques only work temporarily. Eventually, the search engine spiders catch on and your site is permanently banned from being listed.  This is why you should never use “Black Hat” techniques. It may pay off for a few short weeks, but it permanently hurts your site and its credibility on the internet.

So if I can’t use “Black Hat” techniques, what techniques can I use to help my site get listed high on search engines?

Fortunately, there are multiple techniques that you can use to get your site legitimately listed high on search engines. We’ll go over two such techniques right now.

Technique #1: Natural Keyword Usage

Earlier in this article, we talked about keyword stuffing.  While keyword stuffing is an awful thing to do, the natural use of keywords is perfectly fine. By natural, we mean keywords that are spread throughout a document in a way that isn’t blatant.

How do you naturally use keywords in your content to benefit your site? Before you even write your content, take a few minutes to identify some keywords that are relevant to your site. Then, as you write, incorporate the keywords you have picked out in a natural way throughout the content. Ideally, if your content is 600 words, you’ll want to use the main keyword between 6 and 18 times, which is a keyword density of 1–3% (keyword density is the number of keyword occurrences divided by the total word count of a document). Much less than that won’t be beneficial to your site, while anything much above 3% starts to look like keyword stuffing, so try to stay within that range.
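The density calculation described above is simple to automate. Here is a minimal Python sketch (the function name and sample text are made up for illustration; a toy sentence this short will naturally show a much higher density than a real 600-word article):

```python
# Sketch: keyword density = occurrences of the keyword / total word count.
def keyword_density(text: str, keyword: str) -> float:
    """Return the fraction of words in `text` equal to `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

content = "digital photos are fun and digital photos are easy to print"
density = keyword_density(content, "digital")
print(f"{density:.1%}")  # 2 of 11 words -> about 18.2%
```

For a real article, you would run this over the full draft and adjust until the main keyword lands in the 1–3% range.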

One thing to keep in mind is that actual people are reading your site. Thus, you should make sure that the use of keywords does not distract your readers from the whole meaning of the content.

Technique #2: Link Exchanges

Linking is a very common practice between websites. How do you do it? Well, you ask a site that is relevant to yours to post a link to your site on their site. In return, you can offer to link to their site on your site. That way, both sites get a benefit from the linking.

How does this help your listing? Well, each time a search engine spider visits a page with a link to your site on it, the spider will follow that link and visit your site. Links from relevant sites also act as a signal of your site’s credibility.
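To see why links matter so much to spiders, here is a minimal sketch of how a crawler discovers links on a page, using only Python’s standard library (the class name and the sample page are invented for illustration; a real spider would then fetch each discovered URL and repeat):

```python
# Minimal sketch of link discovery: parse a page's HTML and collect
# every href from its anchor tags, as a crawler would.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<p>See <a href="https://example.com/photos">our photos</a>.</p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['https://example.com/photos']
```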

If you keep the above in mind, your site will get listed high on a search engine with no danger of being de-indexed.

While “Black Hat” techniques temporarily do work, they never pay off in the long run. So if you want a legitimately high search engine listing, don’t use “Black Hat” techniques.  Use the natural methods for high search engine rankings!

Parting Words of Encouragement

Search engine optimization can be very confusing, especially when you are just getting started. You have taken the first step toward creating an SEO strategy, and it is one that you won’t regret. Even better, you are well on your way to creating an SEO plan from the comfort of your own home.
