Tuesday, September 28, 2010

How To Get To The First Page Of Google?

#1 – Paraphrasing Mike Grehan, author of The Search Engine Marketing Book

If you create a page about a unique word or phrase, you can easily rank for that term.

As an example, if you were to create a site about xandrough, Google will want to return that site when a searcher uses that term because your site would be the most relevant to that term. The problem of course is that nobody is likely to type that into any search engine because the word does not exist, so ranking for it won’t do any good.

However, how about a keyphrase like:

Landscape Design Sussex County, NJ

Now, the possibilities of this concept seem pretty exciting for landscapers located in northwest NJ. The sites returned in the organic results lack a sense of relevancy. Is that because, like xandrough, the phrase is not likely to be typed into the engine? I don’t think so: as you can see, many landscapers are competing for the phrase in AdWords. It actually looks to me like a phrase that is juicily near a conversion worth thousands of dollars.

Try a search like this for your business plus a regional term. If you don’t find many sites that match that query, there may be an opportunity.

#2 – Search Engines Return Pages Not Sites on the Results Page

With this in mind for the example above, the page has to be about Landscape Design in Sussex County, not the site. In other words, if one page on your site is about landscape design and another page is about Sussex County, no page of your site will come up for the keyphrase ‘landscape design sussex county’.

This can be used to the advantage of the linkless site, making it possible to come up in the results for multiple search queries. For example, one page could be for Lawn Maintenance, while another could be Landscape Design.

#3 – The Order and Proximity of The Keywords Matter

Write a sentence or phrase that contains both the service and the geographic term and have those terms in close proximity.

#4 – Time

It will take some time before the search engines trust a linkless site. So, you may just have to wait. However, if your site is a couple of years old and you rewrite it, applying some of these principles, you may be able to rank for your terms as soon as the next time Googlebot visits your site.

To speed things up even more, you may want to add your business listing to Google Maps and verify it. After filling out the information you will receive an automated telephone call, and you will have to enter the numbers they provide you.

Also, you could fill out the business profile on superpages.com. Don’t just set up a listing; make sure you add your URL to the business profile… Google only spiders business profiles on superpages.com, not the rest of the site.

The above options are free… the links they provide won’t have any value in ranking your site in search results, but they may help your site get indexed more quickly by Google. Both sites also get traffic in their own right and could score you some work.

Sunday, September 19, 2010

Competitive Keywords and Domain Names

Highly competitive keywords and phrases

Keyphrases such as personal loan and cheap mortgage are highly competitive. The competition is very fierce to rank highly in search engines for these keywords and phrases, as they are part of a very lucrative market, especially in the current economic climate. The top websites (those shown in the first 30 results for personal loan) will have spent a lot of time and money reaching their high ranking and will be proactive in staying there. In this case, other methods of promotion could offer a cheaper and quicker source of targeted visitors to your site. These methods include:

1. Viral marketing
2. Conventional forms of advertising, such as radio
3. E-mail marketing
4. Paid Internet advertising, such as Google AdWords

Domain name

If possible, purchase a domain name that sums up your products and services. There is no proof that this helps within search engines but it does help in many other areas. For example, some people looking for specific information may just enter a domain name into a browser address bar and hope to strike lucky instead of using a search engine.

An example would be www.bouncycastlehire.co.uk. The domain name sums up exactly what this company provides, is very easy to remember, and stands out in a page of search engine results. Standing out in a page of SERPs is important, as you will have 9 other websites to compete against.

Imagine you have a family and are looking to buy a new car. At the showroom you are faced with a row of 10 cars, all with the same technical specification, but one has a sign in it that states “ideal for families”. Would you request details about this car first? If you can create the perception that your website has the information required by the domain name alone, then you have already generated interest.

Try to stay away from hyphenated domain names if possible. Although this has no negative effect in the search engines, many people forget to put it in and may reach one of your competitor’s websites instead. Research what other domain names with similar spelling there are and check to see if they are competitors. Deliberately purchasing a domain name very similar to a direct competitor (e.g. same words but spelt or arranged differently) can attract a negative legal response from the other owner, especially if you provide similar products or services.

Having to write www.InternetConsultancyandMangment.com into a browser would be unusually cruel!

Sunday, September 12, 2010

What The Search Engines Say - Teoma

To truly understand what sets Teoma apart from the competition, you first need to know the 3 primary techniques used by search engines today:

Text Analysis - determines a site’s relevance by the text on the page. This technique was fine when the Web was small and spammers couldn’t artificially increase their rankings.
Popularity - determines that the more links there are to a site, the more popular it is. However, this is not necessarily the best judge of relevance.
Status - goes beyond popularity by analyzing the importance or "status" of the sites providing the incoming links. But this lacks context because it doesn’t calculate whether the incoming links are related to the search subject.
Subject-Specific Popularity - This is the Teoma difference. Teoma uses elements of the above three techniques, but it does more. Rather than rely on the recommendations of the entire Web audience to determine relevance, Teoma technology uses Subject-Specific Popularity to connect with the authorities - the experts within specific interest groups that guide it to the best resources for a subject.

The Teoma Web Crawler FAQ

The Teoma Crawler is Ask Jeeves' Web-indexing robot (or, crawler/spider, as they are typically referred to in the search world). The crawler collects documents from the Web to build the ever-expanding index for our advanced search functionality at Ask Jeeves at Ask.com, Ask.co.uk and Teoma.com (among other Web sites that license the proprietary Teoma search technology).

Teoma differs from any other search technology because it analyzes the Web as it actually exists - in subject-specific communities. This process begins by creating a comprehensive and high-quality index. Web crawling is an essential tool for this approach, and it ensures that we have the most up-to-date search results.

Q: What is a Web crawler/Web spider?
A: A Web crawler (or, spider or robot) is a software program designed to follow hyperlinks throughout a Web site, retrieving and indexing pages to document the site for searching purposes. The crawlers are innocuous and cause no harm to an owner's site or servers.

Q: Why does Teoma use Web crawlers?
A: Teoma utilizes Web crawlers to collect raw data and gather information that is used in building our ever-expanding search index. Crawling ensures that the information in our results is as up-to-date and relevant as it can possibly be. Our crawlers are well designed and professionally operated, providing an invaluable service that is in accordance with search industry standards.

Q: How does the crawler work?
A: The crawler goes to a Web address (URL) and downloads the HTML page. The crawler follows hyperlinks from the page, which are URLs on the same site or on different sites. The crawler adds new URLs to its list of URLs to be crawled. It continually repeats this function, discovering new URLs, following links, and downloading the pages they point to.

The crawler excludes some URLs if it has downloaded a sufficient number from the Web site or if it appears that the URL might be a duplicate of another URL already downloaded. The files of crawled URLs are then built into a search catalog. These URLs are displayed as part of search results on the sites powered by Teoma's technology when a relevant match is made.
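The loop the answer describes can be sketched in a few lines of Python. This is a toy illustration under stated assumptions, not Teoma's actual crawler: `fetch` and `extract_links` are assumed stand-ins for real page downloading and link extraction.

```python
from collections import deque
from urllib.parse import urljoin, urldefrag

def crawl(seed_urls, fetch, extract_links, max_pages=100):
    """Sketch of the frontier loop described above: download a page,
    collect its links, queue the new ones, and repeat until done."""
    frontier = deque(seed_urls)
    seen = set(seed_urls)
    pages = {}
    while frontier and len(pages) < max_pages:
        url = frontier.popleft()
        html = fetch(url)                      # download the HTML page
        pages[url] = html
        for link in extract_links(html, url):  # follow hyperlinks
            absolute, _ = urldefrag(urljoin(url, link))
            if absolute not in seen:           # skip already-queued URLs
                seen.add(absolute)
                frontier.append(absolute)
    return pages
```

The `seen` set plays the role of the duplicate exclusion the FAQ mentions, and `max_pages` caps how much of one site gets downloaded.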

Q: How frequently will the Teoma Crawler download pages from my site?
A: The crawler will download only one page at a time from your site (specifically, from your IP address). After it receives a page, it will pause a certain amount of time before downloading the next page. This delay time may range from 0.1 second to hours. The quicker your site responds to the crawler when it asks for pages, the shorter the delay.

Q: How can I tell if the Teoma crawler has visited my site/URL?
A: To determine whether the Teoma crawler has visited your site, check your server logs. Specifically, you should be looking for the following user-agent string:

User-Agent: Mozilla/2.0 (compatible; Ask Jeeves/Teoma)
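If you want to pull those visits out of an access log, a small script can do it. The combined log format and the regex below are assumptions about a typical Apache-style setup, not anything Ask Jeeves specifies.

```python
import re

# Assumed Apache combined-log-format line; this regex is a sketch,
# not a complete log parser.
LOG_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def teoma_paths(log_lines):
    """Collect the paths requested by the Teoma crawler from
    combined-format access-log lines."""
    paths = []
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and "Ask Jeeves/Teoma" in m.group("agent"):
            paths.append(m.group("path"))
    return paths
```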

Q: How did the Teoma Web crawler find my URL?
A: The Teoma crawler finds pages by following links (HREF tags in HTML) from other pages. When the crawler finds a page that contains frames (i.e., it is a frameset), the crawler downloads the component frames and includes their content as part of the original page. The Teoma crawler will not index the component frames as URLs themselves unless they are linked via HREF from other pages.

Q: What types of links does the Teoma crawler follow?
A: The Teoma crawler will follow HREF links, SRC links and re-directs.

Q. Does the Teoma crawler include dynamic URLs?
A. We include a select number of dynamic URLs in our index. However, they are screened to detect likely duplicates before downloading.

Q: Why has the Teoma crawler not visited my URL?
A: If the Teoma crawler has not visited your URL, it is because we did not discover any link to that URL from other pages (URLs) we visited.

Q: How do I register my site/URL with Teoma so that it will be indexed?
A: We appreciate your interest in having your site listed on Ask Jeeves and the Teoma search engine. It is important to note that we no longer offer a paid Site Submission program. As a result of some recent enhancements to Teoma, we're confident that we're indexing even more Web pages than ever, and that your site should appear in our Search index as a result of our ongoing "crawling" of the Web for new and updated sites and content.

If you are the owner/webmaster of a site in question, you may also want to research some online resources that provide tips and helpful information on how to best create your Web site and set up your Web server to optimize how search engines look at Web content, and how they index and trigger based upon different types of search keywords.

Q: Why aren't the pages the Teoma crawler has indexed showing up in the search results at Teoma.com?
A: If you don't see your pages indexed in our search results, don't be alarmed. Because we are so thorough about the quality of our index, it takes some time for us to analyze the results of a crawl and then process the results for inclusion into the database. Teoma does not necessarily include every site it has crawled in its index.

Sunday, September 5, 2010

What The Search Engines Say - MSN Part II

Items and techniques discouraged by MSN Search

The following items and techniques are not appropriate uses of the index. Use of these items and techniques may affect how your site is ranked within MSN Search and may result in the removal of your site from the MSN Search index.
- Loading pages with irrelevant words in an attempt to increase a page's keyword density. This includes stuffing ALT tags that users are unlikely to view.
- Using hidden text or links. You should use only text and links that are visible to users.
- Using techniques to artificially increase the number of links to your page, such as link farms.

About site ranking

MSN Search site ranking is completely automated. The MSN Search ranking algorithm analyzes factors such as page contents, the number and quality of sites that link to your pages, and the relevance of your site’s content to keywords. The algorithm is complex and never human-mediated. You cannot pay to boost your site’s relevance ranking; however, we do offer advertising options for site owners.
Each time the index is updated, you may notice a shift in your site’s ranking. As new sites are added and some sites become obsolete, previous relevance rankings are revised.
Although you cannot directly change your site’s ranking, you can optimize its design and technical implementation to enable appropriate ranking by most search engines.

About your site description

As the MSN Search web crawler, MSNBot, crawls your site, it analyzes the content on indexed pages and generates keywords to associate with each page. Then MSNBot extracts page content that is highly relevant to the keywords (often sentence segments that contain keywords, or information in the description meta tag) to construct the site description displayed in search results. The page title and URL are also extracted and displayed in search results.

Updating your site description
Site descriptions are extracted from the content of your page each time MSNBot crawls your site and indexes its pages. If you change the contents of a page, you may see a change in the description the next time our index is updated.

Since the descriptions are extracted from your indexed web pages, the best way to affect your site description is to ensure that your web pages effectively deliver the information you want to see in search results.

Excellent content design and effective use of terms that target your message are the best ways to affect the site description that MSNBot extracts from your site. Effective strategies include:

Placing descriptive content near the top of each page.
Making sure each page has a clear topic and purpose.
Adding a site description to the description meta tag.
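The pieces MSNBot is said to look at, the page title and the description meta tag, can be pulled out of a page with Python's standard-library HTML parser. This is a sketch for checking your own pages, not MSNBot's actual extraction logic.

```python
from html.parser import HTMLParser

class PageSummary(HTMLParser):
    """Extracts the <title> text and the description meta tag content,
    the two pieces a crawler commonly shows in search results."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def summarize(html):
    parser = PageSummary()
    parser.feed(html)
    return parser.title.strip(), parser.description.strip()
```

Running this over each page of a site is a quick way to spot pages with missing, duplicated, or unhelpful titles and descriptions.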

Sunday, August 29, 2010

What The Search Engines Say - MSN Part I

Technical recommendations for your website

Use only well-formed HTML code in your pages. Ensure that all tags are closed, and that all links function properly. If your site contains broken links, MSNBot may not be able to index your site effectively, and people may not be able to reach all of your pages.

If you move a page, set up the page's original URL to direct people to the new page, and tell them whether the move is permanent or temporary.

Make sure MSNBot is allowed to crawl your site, and is not on your list of web crawlers that are prohibited from indexing your site.

Use a robots.txt file or meta tags to control how MSNBot and other web crawlers index your site. The robots.txt file tells web crawlers which files and folders they are not allowed to crawl. The Web Robots Pages provide detailed information on the robots.txt Robots Exclusion standard.
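Python ships a robots.txt parser you can use to check what a given file would allow or block. The robots.txt contents and the example.com URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks one folder for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("msnbot", "http://www.example.com/index.html"))      # allowed
print(rp.can_fetch("msnbot", "http://www.example.com/private/x.html"))  # blocked
```

Testing your rules this way before publishing the file helps avoid accidentally blocking a crawler you want, which is exactly the mistake the guideline warns about.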

Keep your URLs simple and static. Complicated or frequently changed URLs are difficult to use as link destinations. For example, the URL www.example.com/mypage is easier for MSNBot to crawl and for people to type than a long URL with multiple extensions. Also, a URL that doesn't change is easier for people to remember, which makes it a more likely link destination from other sites.

Content guidelines for your website

The best way to attract people to your site, and keep them coming back, is to design your pages with valuable content that your target audience is interested in.

In the visible page text, include words users might choose as search query terms to find the information on your site.

Limit all pages to a reasonable size. We recommend one topic per page. An HTML page with no pictures should be under 150 KB.

Make sure that each page is accessible by at least one static text link.

Create a site map that is fairly flat (i.e., each page is only one to three clicks away from the home page). Links embedded in menus, list boxes, and similar elements are not accessible to web crawlers unless they appear in your site map.
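The "one to three clicks from the home page" rule can be checked mechanically with a breadth-first search over your site's link graph. This is a generic sketch; the link map would come from your own crawl of the site.

```python
from collections import deque

def click_depths(links, home):
    """Breadth-first search from the home page; each page's depth is the
    minimum number of clicks needed to reach it. `links` maps a page to
    the list of pages it links to."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths
```

Any page whose depth exceeds three, or which is missing from the result entirely, is a candidate for an extra link from the site map.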

Keep the text that you want indexed outside of images. For example, if you want your company name or address to be indexed, make sure it is displayed on your page outside of a company logo.

Sunday, August 15, 2010

Adsense Golden Rules

Is Google AdSense a scam or not? If you are one of those who tried Google AdSense and failed, or got banned when you were near your payout, it is disappointing, and even more so when Google does not tell you the reason. Why did Google kick you out? Well, most of the time we never know. Once you have been kicked out, you cannot rejoin. So to prevent this, I have listed below a very, very abbreviated version of the Google AdSense T&Cs and policies. If you follow these rules you should have no problems.

* Do not encourage users to click your ads.
* Do not put ads on pages with no content, or on pop-up, pop-under, error, registration or similar pages.
* Do not overlap ads with content so that users cannot distinguish between them.
* Do not use automated bots to increase clicks.
* Do not encourage or participate in ‘click groups’ that click each other’s ads.
* Make sure you don’t display more than the maximum number of ads on a page. Check the Google AdSense rules.
* Do not create more than one AdSense account. You CAN have more than one site for a single AdSense account.
* Do not edit or modify the AdSense code (this does not include changing its properties).
* Do not redirect users away from any advertiser’s page.
* Do not click your own ads (not even to test them).
* Do not display pornographic, hateful or any other banned content.
* Do not buy banned sites (typically MFA, made-for-AdSense, sites) from others.

Enjoy!

Google.Com

Monday, August 9, 2010

What The Search Engines Say - Lycos

How do I improve the ranking of my web pages in search engines?

Although we cannot guarantee your placement within search results for particular keywords, the following tips will help you to ensure that your pages are spider friendly:

Write great content that human searchers will understand, and do not try to trick the search engines' algorithms.

Use keywords that searchers use to find your web site in the meta-data. Use your web logs to determine the keywords. Don't guess!

Make sure your web content mentions those keywords near the top of the page. For instance, place the keywords in the headline or in the first paragraph on the page.

Repeat keywords more than once within your web page, but don't overdo it. Too much repetition is considered spam.

How can I make my site spider-friendly?

• Speed: If your site is slow, it will affect the length of time it takes to spider the web site. Try to build pages with few and small graphics.
• Title: Spiders won't index the information if TITLE tags are the same on every page. (TITLE tags are displayed at the very top of the browser.)
• Descriptions: META description tags can be included for each web page. These can provide a better search result description than a spider-created excerpt.
• Registration: Spiders can't traverse a site if there is a username/password in their way. If you must have users login, set up a separate site where the spider can access the content.
• Search-based Sites: Spiders function by following hyperlinks. Purely search-based sites cannot be spidered. Therefore, create a "spider.html" file (i.e. a list of URLs on the site for the spider to traverse).
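A "spider.html" page like the one suggested above is easy to generate from a list of URLs. This is a minimal sketch; the page name and title are just conventions, not anything Lycos mandates.

```python
from html import escape

def spider_html(urls, title="Site index for crawlers"):
    """Build a plain 'spider.html' page: a flat list of links a crawler
    can traverse even when the rest of the site is search-driven."""
    items = "\n".join(
        '  <li><a href="{0}">{0}</a></li>'.format(escape(u, quote=True))
        for u in urls
    )
    return (
        "<html><head><title>{}</title></head>\n"
        "<body>\n<ul>\n{}\n</ul>\n</body></html>".format(escape(title), items)
    )
```

Link the generated page from your home page so the spider can find it, and regenerate it whenever pages are added or removed.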


Thursday, August 5, 2010

What The Search Engines Say - Yahoo!

Yahoo! strives to provide the best search experience on the Web by directing searchers to high-quality and relevant web content in response to a search query.

Pages Yahoo! Wants Included in its Index

• Original and unique content of genuine value
• Pages designed primarily for humans, with search engine considerations secondary
• Hyperlinks intended to help people find interesting, related content, when applicable
• Metadata (including title and description) that accurately describes the contents of a web page
• Good web design in general

Unfortunately, not all web pages contain information that is valuable to a user. Some pages are created deliberately to trick the search engine into offering inappropriate, redundant or poor-quality search results; this is often called "spam." Yahoo! does not want these pages in the index.

What Yahoo! Considers Unwanted

Some, but not all, examples of the more common types of pages that Yahoo! does not want include:

• Pages that harm accuracy, diversity or relevance of search results
• Pages dedicated to directing the user to another page
• Pages that have substantially the same content as other pages
• Sites with numerous, unnecessary virtual hostnames
• Pages in great quantity, automatically generated or of little value
• Pages using methods to artificially inflate search engine ranking
• The use of text that is hidden from the user
• Pages that give the search engine different content than what the end-user sees
• Excessively cross-linking sites to inflate a site's apparent popularity
• Pages built primarily for the search engines
• Misuse of competitor names
• Multiple sites offering the same content
• Pages that use excessive pop-ups, interfering with user navigation
• Pages that seem deceptive, fraudulent or provide a poor user experience

YST's Content Quality Guidelines are designed to ensure that poor-quality pages do not degrade the user experience in any way. As with Yahoo!'s other guidelines, Yahoo! reserves the right, at its sole discretion, to take any and all action it deems appropriate to ensure the quality of its index.

Friday, July 23, 2010

What The Search Engines Say - Google - Part II

When your site is ready:

• Once your site is online, submit it to Google at http://www.google.com/addurl.html.
• Make sure all the sites that should know about your pages are aware your site is online.
• Submit your site to relevant directories such as the Open Directory Project and Yahoo!.
• Periodically review Google's webmaster section for more information.

Quality Guidelines - Basic principles:

• Make pages for users, not for search engines. Don't deceive your users, or present different content to search engines than you display to users.
• Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"
• Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighbourhoods" on the web as your own ranking may be affected adversely by those links.
• Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our terms of service. Google does not recommend the use of products such as Webposition Gold™ that send automatic or programmatic queries to Google.

Quality Guidelines - Specific recommendations:

• Avoid hidden text or hidden links.
• Don't employ cloaking or sneaky redirects.
• Don't send automated queries to Google.
• Don't load pages with irrelevant words.
• Don't create multiple pages, sub domains, or domains with substantially duplicate content.
• Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.

These quality guidelines cover the most common forms of deceptive or manipulative behaviour, but Google may respond negatively to other misleading practices not listed here, (e.g. tricking users by registering misspellings of well-known web sites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles listed above will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.
If you believe that another site is abusing Google's quality guidelines, please report that site at http://www.google.com/contact/spamreport.html. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts.

Sunday, July 18, 2010

What The Search Engines Say - Google - Part I

Following these guidelines will help Google find, index, and rank your site, which is the best way to ensure you'll be included in Google's results. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index. Once a site has been removed, it will no longer show up in results on Google.com or on any of Google's partner sites.

Design and Content Guidelines:

• Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
• Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
• Create a useful, information-rich site and write pages that clearly and accurately describe your content.
• Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
• Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images.
• Make sure that your TITLE and ALT tags are descriptive and accurate.
• Check for broken links and correct HTML.
• If you decide to use dynamic pages (i.e., the URL contains a '?' character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them small.
• Keep the links on a given page to a reasonable number (fewer than 100).

Technical Guidelines:

• Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session ID's, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
• Allow search bots to crawl your sites without session ID's or arguments that track their path through the site. These techniques are useful for tracking individual user behaviour, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
• Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
• Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://www.robotstxt.org/wc/faq.html for a FAQ answering questions regarding robots and how to control them when they visit your site.
• If your company buys a content management system, make sure that the system can export your content so that search engine spiders can crawl your site.
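The If-Modified-Since decision in the technical guidelines boils down to a date comparison. Here is a minimal server-side sketch, assuming HTTP-format dates; a real web server handles this for static files automatically.

```python
from email.utils import parsedate_to_datetime

def respond(last_modified, if_modified_since=None):
    """Decide between 200 and 304 based on the If-Modified-Since header.
    Dates use the HTTP date format, e.g. 'Sat, 17 Jul 2010 12:00:00 GMT'."""
    if if_modified_since is not None:
        if parsedate_to_datetime(last_modified) <= parsedate_to_datetime(if_modified_since):
            return 304  # Not Modified: the crawler can reuse its cached copy
    return 200          # content changed (or no header sent): send the full page
```

Answering 304 instead of resending an unchanged page is where the bandwidth saving mentioned above comes from.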

Wednesday, July 7, 2010

Getting Indexed

Getting Indexed

Once your site is listed in a search engine, you are “indexed”. To be indexed, you can ask other sites that are already indexed to link to you, get listed in a directory (detailed later), or manually submit your site to the search engines (this only needs doing once).

There seems to be little or no logic to the time frame of being indexed, so treat the search engines as temperamental beings and you will keep your sanity. Some sites will be listed in a matter of days; others will take months.

You may notice that I mostly talk about Google even though there are other major search engines out there. This is because figures show that Google handles 47.3% of searches, with Yahoo! next at 20.9% and MSN at 13.6%.

What the search engines say

Over the next few days I will post extracts aimed at webmasters and website owners on site ranking and positioning. These were all taken directly from the search engine sites, though not all from one page. Where possible I have tried to include all relevant information in a fluid style, but as you can imagine, with search engines wanting to keep their algorithms secret, they don’t give too much away. If you are brand new to websites and Internet marketing, there are a few pearls of wisdom in there. If not, the rest of this blog offers more valuable information.

Thursday, July 1, 2010

Algorithms

Algorithms are not just terms you hear used frequently in Star Trek; they are also the foundations of good search engines. In short, and in search engine terms, an algorithm is a set of rules that governs each website’s position in relation to others containing the same keywords and phrases. For example, if two websites sell the same unique product, they cannot both be listed at position one, so the search engine has to decide which is the more relevant website for the given search phrase and show it above the other in the search results. Just because one website shows above another for one phrase does not mean it always will for others. There are hundreds of factors that affect ranking in search engines, many of which we can only guess at.

These algorithms are very closely guarded secrets because, should they become known, website owners could easily change or create websites to become the first site listed for certain keywords or phrases. This happened early on in the Internet boom, and search engines would return results that were completely irrelevant to the searcher. The more relevant the results a search engine returns, the more people will use it, so search engines put a great deal of work into ensuring highly focused results are returned to searchers.

Monday, June 28, 2010

Search Engines

A search engine is like an online version of the yellow pages and is a major part of your business website success (it is possible to be successful without search engines by using viral marketing and other methods but for the time, cost and effort, they usually proportionally yield more fruit). Internet users will go to one of many search engines such as www.google.com. At the time of writing, Google is the search engine of choice within the Internet community and therefore the one most business website owners concentrate on ranking well in.

Search Engine Technology

Search engines work by sending out pieces of software called spiders. These visit websites, make a copy of what they find and send it back to a central database (for those not in the know, a database is a logical method of storing large amounts of data so that it can be searched easily and quickly). These spiders work 24/7 and most search engines have their own. Their mission is to collect as much information about as many websites as possible and send it back. With new sites being added to the Internet every day and old ones being changed, this is a never-ending task.
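The link-following behaviour described above can be sketched in a few lines of Python. This is a toy illustration of how a spider discovers further pages to visit, not how any real search engine’s crawler is actually written:

```python
from html.parser import HTMLParser

class LinkSpider(HTMLParser):
    """Collects the destination of every link on a page, loosely
    mimicking how a spider discovers further pages to visit."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A toy page standing in for HTML the spider has just fetched
page = ('<html><body><a href="/about">About us</a> '
        '<a href="/contact">Contact</a></body></html>')
spider = LinkSpider()
spider.feed(page)
print(spider.links)  # each URL would be queued for a later visit
```

A real spider repeats this for every URL it collects, which is why the task never ends.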

Every month or so, search engines will reorganise their database to take account of discontinued websites, newly added ones and existing ones that have changed. This is why, when you enter a search term into a search engine, the results returned can be vastly different from month to month. What was listed at the top of page 1 for a key phrase last month may now be relegated to page 5 for the same phrase. With Google, this is affectionately known as the “Google shuffle”, although the technical term is re-indexing.

This keeps the returned results current and accurate. Search engines spend a lot of time and money on getting this right, as the more accurate a search engine is at returning the information requested, the more people will use it. This is how Google became the market leader in the search engine market: their results are usually very accurate and they provide many useful tools for narrowing down large searches.

Sunday, June 27, 2010

Online Forms

Online forms are where the visitor fills in a form of varying length and this is converted to an email and sent to the company. Some companies ask for all sorts of information not related to your query and, to make matters worse, won’t let the form be sent until every field has been filled in. If you look at your website statistics (which detail how many people visited the site and how they used it) and see that the contact us page is the one from which most visitors exited, then this is probably the reason why.

If you are going to use one of these, make only the most crucial fields required. If you want to take the opportunity to run a customer questionnaire, then offer an incentive such as a free download, a discount voucher or something similar.

E-mail address

Your website can be your first point of contact with a new customer, and using a free email address such as Hotmail or Wanadoo not only goes against convention but looks unprofessional and can lead to assumptions.

Questions such as “why are they using a free email address when they have their own domain?”, “emails are usually provided free with hosting, don’t they know how to use them?”, “if they don’t know how to use them, they cannot be very technically competent” and “if they are not technically competent, do I want to be giving them my credit card details online?”. You can see how the questions lead to assumptions! Hopefully you will never want to use your personal email address on your business website again after reading that.

You may think that fred@yoursite.com is fine for all email contact, but consider using several addresses, even if you are a one man band. Support@yoursite.com, info@yoursite.com, bookings@yoursite.com and many other variations all give the impression of a larger, more professional organisation. It also assures senders that their email will go straight to the department or person responsible, and not into a general inbox that will be actioned only once someone has got through the hundreds of other emails. It is little extra work for a large perceived benefit.

Reassurance

When you walk into a shop, you can see that it is a permanent business unlike a tele-sales phone call or faxed offer.

As there is nothing tangible about a website, you will be faced with the same issues. These can be overcome by reassuring visitors that you are a real business: show pictures of your premises and your staff. Listing the trade organisations you belong to, your VAT registration number and your limited company number will also achieve the same goal.

If you are selling on the Internet then you definitely need to mention the security measures you have taken to keep credit card details secure and detail your privacy policy regarding their personal information. See the legal section for more information.

Saturday, June 26, 2010

Contacting The Business

You need to make it as easy as possible for people to contact you, because the best way to alienate your audience is to provide limited and slow-to-respond contact details such as a postal address alone. There are several approaches to making your immediate contact details prominent.

• Incorporate your contact details into the graphic design so that every page shows them. Making 0800 or 0845 numbers prominent also goes a long way towards prompting people to call. The free phone call might be the only reason your organisation is chosen over others.
• Anywhere your written content prompts the reader to contact you, embed a link to the contact us page within the copy, e.g. “to find out more information about our widgets, contact us”.
• Make email addresses links that open a new email (there are concerns about email harvesting programs, which are covered later; the choice is yours between potential spam and ease of use for your customers).
• Have an interactive online chat facility on your site. This enables customers to immediately pose a question to a member of your organisation.
• Add a call-me-back button. Visitors click this and enter a preferred time to be called back.

Web page readers tend to scan information at least 25% faster than if they were reading a newspaper or magazine, so you need to make contact details, action buttons and statements (“to purchase click here”, “to join our newsletter click the button”, etc.) larger and more noticeable than their surroundings. If not, you risk that visitor leaving your website to search elsewhere.

Imagine you walk into a spare parts shop with all the goods shown behind the counter but no one at the front desk to take your order or answer your questions. How long would you wait for someone to turn up? Unless your need for those products was urgent, trying another shop would soon become preferable to waiting around.

In summary, make it very easy for website visitors to contact you immediately. A point to note: it is in any case a requirement that all e-commerce shops clearly state their full trading address.

Friday, June 25, 2010

Google Sandbox. Your Target Audience

It is worth writing about this one particular search engine phenomenon, as Google seem to be the only one to have it. The “Google sandbox” is reputed to be a sort of holding area or status assigned to your website which prevents it from ranking well for major keywords in the first 6-9 months of its existence, while less competitive ones perform OK.

Gossip, rumour and a few self-professed experts all say that this phenomenon prevents webmasters from manipulating the search engines by creating a site with many fake or dubious links pointing to it and ranking well immediately for competitive key phrases. If this were the case, then what a good idea! We would all have a level playing field, no one could cheat and only serious website owners would rank well… in theory. True or not, there seems to be little to no hard evidence available to confirm it.

Your target audience

Identifying and knowing your existing or potential target audience will guide the design and content of the site. Our bright yellow chicken man for hire business knows that it has two target audiences: private party and corporate bookings. The private audience are likely to be male or female, aged 18 to 45, with a big sense of humour, who pay close attention to price and are looking to add humour or surprise to a private gathering. The corporate audience will be time-poor 25-45 year old junior to middle managers, or their secretaries, who don’t care about price because they are not paying for it, the company is. So long as the chicken man turns up, their job is done.

Writing one set of content for such a diverse audience can be next to impossible, so creating two unique sales pitches and styles of writing would be best suited here. Having “corporate entertainment” and “private parties” buttons in the navigation structure will help ensure the right people are looking at the right content.

The time-poor executives will want short, concise content, whereas the private customers will want confirmation and guarantees that their hard earned cash will yield the desired result (a good night out), plus lots of examples of previous parties to reassure them. By researching and setting objectives for each target audience, you maximise the chances of visitors doing what you desire, e.g. contacting you, making a booking etc.

Existing businesses will already know who their customers are and can change their website accordingly. New businesses can either research or make educated guesses. These soon become less of a guess and more of a confirmation as your business grows and you collect more data about your website and business customers.

Thursday, June 24, 2010

The First 30 Results

An accurate measure of keyword competition is to see how well optimised the first 30 websites are. Indications that a page is optimised for a keyword or key phrase are when the term:

• Is used at least once in the page title
• Is used in the meta description (found by viewing the source of the HTML page; e.g. in Internet Explorer, click View, then Source, and look for meta name="description" &ndash; the keywords and phrases will be listed after it)
• Is used in the headings of the page content
• Is used several times in the content (typically 3%-9% of the words)
• Is shown in the navigation structure as a link
• Is used in the anchor text of a link pointing to that site
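The 3%-9% density figure above can be checked mechanically. Below is a rough Python sketch of one common way of computing keyword density (occurrences of the phrase as a share of the words on the page); the exact formula varies between SEO tools, and the sample copy is invented for the curved glass example, so treat the numbers only as a guide:

```python
import re

def keyword_density(text, phrase):
    """Occurrences of the phrase, counted in words, as a percentage
    of the total words on the page. One rough definition of density."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    if not words:
        return 0.0
    hits = sum(
        words[i:i + len(target)] == target
        for i in range(len(words) - len(target) + 1)
    )
    # each hit accounts for len(target) of the page's words
    return 100.0 * hits * len(target) / len(words)

# Hypothetical sample copy for the curved glass example
content = ("We design, manufacture and install curved glass for "
           "shopfronts, staircases and furniture. Every piece of "
           "curved glass is made to order in our Sussex workshop.")
print(keyword_density(content, "curved glass"))  # 16.0 for this sample
```

A figure well above the 3%-9% range, as here, would usually suggest the copy reads as keyword stuffing and needs diluting with natural text.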

Google provides a couple of excellent search tools to gauge competition for your keywords and phrases. These are:

Allintitle. If you start a search with allintitle:, Google will restrict the results to pages with all of the query words in the page title. For instance, allintitle: Internet Consultant will return only documents that have both Internet and Consultant in the title. Using speech marks around your search keywords provides an even more focused result, e.g. allintitle:"Internet Consultant" will show only websites that have the exact phrase in the page title.
Intitle. If you use intitle: in your query, Google will restrict the results to pages containing that word in the title, while considering other places in the document for the remaining words. For instance, intitle:Internet Consultant will return documents that mention the word Internet in their title and mention the word Consultant anywhere else in the document. This may sound the same as the previous tool but it is not: allintitle: applies to every word in the query, whereas intitle: applies only to the word immediately following it.
Inanchor. The inanchor: command will tell you which websites have the keyword or phrase contained within incoming links to them, e.g. inanchor:keyword shows all websites that have a link pointing to them using your keyword. Using the three search tools together can provide an accurate figure for competing websites, e.g. allintitle:keyword inanchor:keyword shows websites with the keyword both in the page title and within their incoming links.
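To save retyping the operators for each phrase you research, the queries can be generated as strings. This is a small illustrative Python helper of my own, not a Google tool; the result counts still have to be read off the results page by hand:

```python
from urllib.parse import quote_plus

def competition_queries(phrase):
    """Build the operator queries described above for one phrase."""
    exact = f'"{phrase}"'
    return {
        "allintitle": f"allintitle:{exact}",
        "intitle": f"intitle:{exact}",
        "inanchor": f"inanchor:{exact}",
        # page title AND incoming links: the combined competition check
        "combined": f"allintitle:{exact} inanchor:{exact}",
    }

def search_url(query):
    # Plain Google UK search URL for a query (illustrative helper)
    return "http://www.google.co.uk/search?q=" + quote_plus(query)

queries = competition_queries("Internet Consultant")
print(queries["combined"])
print(search_url(queries["allintitle"]))
```

Paste the generated query into Google (or open the generated URL) and note the result count for each of your candidate phrases.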

The curved glass manufacturer we mentioned earlier is an excellent example of using less competitive keywords and phrases to give a new website a kick-start. A check of Google reveals that the term glass returns nearly 5,000,000 results, with big companies and organisations such as Pilkington and British Glass in the top 30. These are large corporate sites that are well established and have been around for a long time in Internet terms. All the websites in the top 10 were highly optimised for the same keyword, so trying to drive traffic to the curved glass manufacturer’s website for the term glass in the short term would have been very difficult. Instead, we suggested going after less competitive terms to start with.

Checking PPC pricing

PPC (pay per click) is another good indication of the competition level for your keywords and phrases. Go to http://www.google.co.uk/ads/ and sign up for the Google AdWords program (free to sign up and use the tools). The more you have to pay to be on the first page of results, the more competitive your keywords and phrases are.

A mixture of checking the first 30 results for a given keyword and phrase and PPC costs will give you an idea of what you are up against.

Wednesday, June 23, 2010

Competition Level For Keywords And Phrases

The competition level of a key phrase can be measured in several ways. These include:

• Checking how many websites a search engine returns containing your keywords and phrases.
• Checking how well optimised each site or page in the first 30 results is, including the number of links pointing to it.
• Checking the pricing for your keywords and phrases in pay per click advertising.

Number of websites listed

If you are going to check the number of websites containing your keywords and phrases then simply enter them into the search engine and then check how many results are returned.

This only gives you a guide to the number of web pages containing that term in that search engine and does not truly reflect the competition level. It is possible, and in some cases likely, that the hundreds of thousands of websites returned have not deliberately optimised for the term, e.g. they mention it only once in their content and not in the page title, so ranking above them can be relatively easy.

Keywords - Niche

So now you have established your business website objectives, you need to find your most effective keywords (a single word frequently entered into search engines) and keyphrases (a collection of keywords entered into search engines). Keyword research tools can be found at:

• http://inventory.uk.overture.com/d/searchinventory/suggestion (free)
• https://adwords.google.co.uk/select/Login?sourceid=AWO&subid=UK-ET-ADS&hl=en_GB (may require free account to use)
• http://wordtracker.co.uk (requires payment)

These tools will tell you how many people are searching on these terms, usually for search engine traffic monitored from the previous month.

As most visitors to a website come from search engines, it makes good business sense to make sure your website is "search engine friendly" so that they refer traffic, or at least the right kind of traffic, to you. Many factors affect the type and number of visitors to your site, but content and services are the most important. If your content or services are poor then so will your website’s results be. Poor content can be summarised as that which offers little or no value to visitors, e.g. pure advertising and marketing "waffle"; services cover useful features such as online calculators, research tools etc.

Your new website will be more successful to start with if you concentrate on a specific area of your business with limited or no competition in search engine terms.

Search engine users typically enter 1-3 words into a search engine to find the information they are looking for, e.g. cheap holiday Egypt, and rarely go past the third page (30th result) of search engine results pages (SERPs), as they have usually found what they are looking for by then.

As we mentioned before, single search words are called keywords, e.g. holiday, and several keywords together are called keyphrases. A competitive keyword or phrase is generally accepted as one that many websites are trying to make themselves "search engine friendly" for.

Using less competitive keywords and keyphrases should make it quicker and easier to be listed within the top 3 pages (10 results per page) for that particular keyword or key phrase. The closer you can get to page one, position one within the SERPs (search engine results pages), the more likely a searcher is to click through to your website.

Being listed within the first 30 results in a search engine for a given keyword or key phrase means you "rank well" for it, or that your site is "optimised" for it.

By all means, still use the competitive keywords and phrases that apply directly to your business, but expect these to be slower to deliver Internet traffic because many other business websites are competing for them. For example, if you are selling books, it is very unlikely you will pose a threat to www.amazon.co.uk and have your new website placed above it in the search engine results pages for the general search term books.

Having your website shown at the top of the SERPs for such a highly sought after phrase would take an expert a lot of time and money. Instead, concentrate on a specific line of books; it would be easier to rank highly for engineering books than for just books.

Quick Overview Of A Web Page

For those unsure of some of the elements of a web page, the following explanation should help.

• Page title: Look at the top of any Internet Browser window and you will see a blue bar. In it should be a description of what the page is about.
• Headings: Look at the web page text. Headings are the descriptions or summaries of what the following text is about, e.g. “quick overview of a web page” is a heading. These are usually shown in a different font or size, in bold or italic. Headings can even be shown in the same style as the rest of the page text but listed in the source code of the web page as a heading (you don’t need to worry about that for now, just be aware that headings are relevant)
• Content: The actual words that you can see on a page, usually set out in paragraphs. Words shown in buttons and graphics are not considered content.
• Navigation structure: Most websites will have a series of buttons or links shown on every page that will navigate you to the rest of the website.
• Alt text: When you leave your mouse over a picture or graphic for a couple of seconds, a small yellow box appears containing text. This text is used to describe the picture for people using text-only browsers and for the visually impaired.
• Meta tags: Information not visible on the web page itself. It is stored within the HTML source that makes up the page and describes elements of it, such as a description of the contents, the author and more.
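For the technically curious, the elements listed above can be pulled out of a page's HTML source with Python's standard html.parser module. This is a simplified sketch (real pages are messier), and the sample page is an invented one for the curved glass manufacturer:

```python
from html.parser import HTMLParser

class PageElements(HTMLParser):
    """Extracts the page title, meta description, headings and
    alt text from a page's HTML source."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.headings = []
        self.alt_texts = []
        self._reading = None  # tag whose text we are currently inside

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")
        elif tag == "img" and attrs.get("alt"):
            self.alt_texts.append(attrs["alt"])
        elif tag in ("title", "h1", "h2", "h3"):
            self._reading = tag

    def handle_data(self, data):
        if self._reading == "title":
            self.title += data
        elif self._reading in ("h1", "h2", "h3"):
            self.headings.append(data)

    def handle_endtag(self, tag):
        if tag == self._reading:
            self._reading = None

# Hypothetical source for the curved glass manufacturer's home page
page = ('<html><head><title>Curved Glass Specialists</title>'
        '<meta name="description" content="Made to measure curved glass">'
        '</head><body><h1>We bend like no other</h1>'
        '<img src="workshop.jpg" alt="Our Sussex workshop"></body></html>')

p = PageElements()
p.feed(page)
print(p.title, p.headings, p.alt_texts)
```

This is essentially what a search engine spider does at a much larger scale: it reads these same elements when deciding what a page is about.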

Tuesday, June 22, 2010

Setting Objectives

You should identify what you want your website to achieve and clearly state your expectations and objectives. These can also guide later decisions about the site, e.g. if increased sales is an objective and you sell a niche product, then advertising on a generalised website will be of little benefit.

Website objectives can typically be:

1. Increase sales
2. Improve product or Company branding
3. Reduce time spent with customers asking similar questions
4. Reduce recruitment costs
5. Improve customer relationship management
6. Reduce administration such as invoicing and telephone answering
7. Collect information to better understand your business and customers
8. Reach new markets
9. Reduce costs by seeking new suppliers

The business owner should set and review the objectives frequently, because they know current and future business requirements. The list is usually specific to each organisation and can run from one objective to many. As your business changes, so will your objectives; this is one of the main reasons why you should permanently monitor the site.

If you know what your objectives are, then you have something to measure success or failure against. If you do not, you cannot measure success or failure, so why pay money for something when you don’t know whether it is working?

Extra guidelines to follow

The following points may make your site more effective:

1. Collect data about your customers, even if it is only their email address as this gives you a chance to re-capture their attention at a later date.

2. Provide testimonials from previous clients as it builds trust.

3. Show membership of any credible organisations that you may belong to, as this shows commitment to your business.

4. Have a customer service policy or mission statement, something that shows the level of service you are aiming to provide.

5. Easy to understand terms of business. Large amounts of legalese and “we are not responsible for anything” will create mistrust and suspicion.

6. Privacy policy, especially if you collect information of a sensitive nature such as personal details and credit card information as this reassures the visitor you will respect their privacy.

7. Consider a viral marketing strategy, as these can work faster and last longer than a search engine strategy; combining the two can be very effective. Viral marketing works on the principle that your customers or website visitors communicate details of your company, products or services to others, usually by forwarding an email, brochure, free product etc. Hotmail is an excellent example of viral marketing: a link to join Hotmail was included in every email message sent.

8. Try to build a relationship with your customers by using tools such as newsletters (with a double opt in approach), membership area etc. This is beneficial because the mistrust barriers are removed once the initial contact or purchase has been made.

9. Advertise for CVs even if you are not actively recruiting, but do mention that you are not, as it will stop the endless phone calls from desperate job seekers. You may receive an outstanding CV that you will make room in your organisation for; if not, there are plenty of networking opportunities to be found from talking to other people in a similar line of business to your own. CVs are also useful indicators of the health of your competitors, market trends and more.

10. Avoid background images unless they add to your site. Black text on a white background is the easiest to read quickly. Web users tend to speed-read text up to 25% faster than on paper, and background images do not help this.

11. If you are going to use photographs, either use a professional to take them or buy royalty free photographs on the Internet. Obvious snapshots detract from the professional design of the website.

12. Have an easy to remember 1-2 line phrase which sums up your business, e.g. a scaffolding company which states “we have the biggest erections in town” or a curved glass manufacturer using “we bend like no other”. The more ways of separating yourself from other “standard” websites and being remembered, the better. Think of how many catchy marketing phrases and tunes you have remembered and why. If we mention “you do the shake and vac to put the freshness back” or “you can’t get better than a Kwik-Fit fitter”, do the tunes instantly come to mind?

Monday, June 21, 2010

8 Minimum Standards To Achieve

Most successful small to medium sized business websites achieve the following recommended standards:

1. Have objectives, predetermined before ANY work is started. These will be the foundations of the whole site. The directors or owners of the company should set the main objectives, preferably after reading this entire list. Involving key departments such as sales, customer service and even the person who answers the telephone will reveal areas of the business that can be streamlined and improved; for example, your sales staff may request that quotes be handled online so as to spend less time on the phone. Get every department to list their wants and needs; you may be surprised at how many can be met using your website.

2. Identify your target audience, then write the content and structure the site for them. If you sell reading glasses then you definitely do not want to be using small blue fonts on a grey background. Other examples: time-poor executives appreciate short, to-the-point content; interior designers like lots of pictures; small business owners like free useful tools; the younger generation goes for interactivity such as games; men go for “how does it work” and women for “how does it make you feel”. In summary, know who you are writing for and write accordingly.

3. Be search engine friendly, i.e. constructed in a way that search engines can read every page you want them to. If the search engines are not reading your site, it is the same as paying for thousands of leaflets to be printed and then not distributed.

4. Optimise the website for search engines so that the content you have written for your customers appears in the first 3 pages of search engine results, preferably on the first page for your keywords and phrases. This is achieved by identifying which keywords and phrases to chase, analysing the competition for them, and then implementing the best and most achievable ones in your site.

5. Use a recognised sales approach. Your website should offer benefits to its visitors, not standard advertising blurb. Remember that people do not buy drill bits because they are interested in sharpened pieces of toughened steel; they want the holes the bits create. So talk about the end benefits of your products or services to your customers, not how great and fantastic the company is.

6. Be quick to download for those on dial up modems.

7. Use multimedia (pictures, sounds and movies) to achieve an objective, not because your web designer offered you a deal.

8. Have an Internet strategy for your website. Only you will know what works for your product or service once you have tried different methods. Research, implement and monitor what works best for you, such as paid-for advertising or organic search engine placement.

Case Study

One company I have spoken to supplies specialist light bulbs, such as theatre and runway lighting, to over 10,000 organisations.

Around 2001 the government announced a push towards e-government. Everything from VAT to land registry was reviewed to see what could be transferred from paper to electronic media. This eventually filtered through to local authorities, some of which then informed their suppliers that by October 2005 they would not conduct business with organisations that would not invoice electronically, although scanning a paper invoice and emailing it was deemed satisfactory. Unless the light bulb supplier conformed, it faced the loss of 7,500 clients, 75% of its client base. This demonstrates how politics can change your business almost overnight.

So, you are convinced that an Internet presence communicating with a potential 850 million people worldwide (as of 2005) is worthwhile, and you want to know how to make your website a success.

I Don't Need A Web Site

This is usually a statement made by business people who have been trading for a while, usually since before the Internet took off, and who “are doing fine as we are, thank you very much!”, or who really do not understand what the Internet is all about.

This statement will hold water if:
• You have no competition.
• You are not likely to have any competition.
• Your existing customers will only use you and no one else.
• You provide a product or service that will never be influenced by politics, economics, geography, lifestyle changes, technology, demographics and so on.

The variables that affect a business are infinite, which is why flexibility is a major part of business success, and a website goes a long way towards providing more flexibility and a more immediate response to changing influences on your business.

While the business “Internet phobes” are turning their backs on the Internet, their competitors are turning to it and reaping the rewards of a competitive advantage. Business website owners can respond to change in minutes, updating pricing, availability and so on; the “Internet phobes” will be recalling catalogues, changing advertising and so on.