Wednesday, February 27, 2008

Google Search Engine Vs Yahoo Search Engine

I suppose you have read my previous article, Strange Google Crawler. After messing with Google search, I decided to get my site into Yahoo search as well. Compared to Google, Yahoo has crawled my pages a little faster: as of today, 27 February, Yahoo search is showing 2 links from my blog, while Google has crawled all the links from my blog. Now let's compare the time span, i.e. the amount of time taken by the Yahoo and Google search engines to crawl my blog.

I had submitted my blog to Google search well before December of last year, and the first link was up by January 16, 2008, whereas I submitted my blog to Yahoo search sometime after February 8, 2008. When I checked both search engines today, here are the statistics:

Google search for site:random-view.blogspot.com shows 13 links.

Yahoo search for http://random-view.blogspot.com shows 2 links, with 32 backlinks from my WordPress blog (I had created those backlinks intentionally, just to get the Google crawler to crawl more pages from this blog).

If I compare the two search engines on the rate at which they crawl a new blog, it is difficult to draw any conclusion, since I have no idea whether the Yahoo search engine picks up links from Google's index or not. One point which does make me feel Yahoo might be using Google's results for further reference is that the links which are up in Yahoo search were up in Google search well before Yahoo crawled them. Anyway, the Google crawler is doing a good job of crawling; here are the crawl graphs from Google Webmaster Tools.

Activity of Googlebot over the last 90 days on my blog:

Number of pages crawled per day: maximum 41, average 9, minimum 2

Number of kilobytes downloaded per day: maximum 1169, average 140, minimum 0

Time spent downloading a page (in milliseconds): maximum 2517, average 582, minimum 123
I don't know whether these statistics are displayed in real time or with some lag. Either way, the statistics are not showing any good sign from Googlebot: the graphs have been trending downwards for the last few days.

Here is the result from Yahoo Site Explorer:

Tuesday, February 26, 2008

Non-IT Company… Can Screw an IT Career

This post may help all of those out there looking for IT jobs. Getting an IT job in a non-IT company is not that difficult. Take me: I joined HDFC Bank on 28 January. It has been just a month, and I am already frustrated with the job profile. At the time of joining I was told I would be part of a Support Management Group, and I am part of that group, but the work I do is nowhere near any kind of IT job. You sit in front of a computer mailing one department, another department, or a vendor; that is what they mean by an IT job. People at HDFC Bank (IT Center) get paid well enough, but the job profile is not what a core IT company demands. If you are thinking about your long-term IT career, this is not the place you should ever work. I have seen people working from 9 in the morning till 9 in the evening, yet doing nothing more than mailing stuff. I was shocked to learn that people at HDFC Bank (IT Center) have been working there for 4 to 7 years. I am not planning any such long-term career with HDFC, which is just screwing my IT career. I never thought I would be doing anything like this in my IT career.
Only one suggestion to new people looking for IT jobs: never join a non-IT company, and never compromise on your interests. Do what you like to do and love to do.

Link to previous post: Microsoft+Yahoo > Google

Wednesday, February 20, 2008

Microsoft+Yahoo > Google


Last week Microsoft offered to buy Yahoo! for $44.6 billion in cash and stock. This represents a 50% premium for shareholders, and indicates Microsoft's anxiety to beat Google. In my view, it also represents a tacit admission from Microsoft that Windows Live, MSN, etc., are all failing to win market share. As the Nielsen Online report indicates, Google is still comfortably controlling the market, and that's in spite of the fact that MSN/Windows Live are the default home pages on virgin Vista/Internet Explorer installations.
This is a big play, even by Microsoft's standards, and is a sure indicator that Microsoft recognises that search is the key battlefield for this decade. Search is the big driver for all online marketing and content. Microsoft's increased muscle in this market will, paradoxically, increase customer choice and hinder Google's hegemony.

From an agency point of view, the biggest problem we face with Yahoo and Windows Live is poor programmatic interfacing. I'm hopeful Microsoft will throw similar resources at a decent API for Windows Live and Yahoo!. If they couple the API with better incentives for intermediaries, large-scale advertisers will have a realistic alternative to Google.


Tuesday, February 19, 2008

Sitemap

What are Sitemaps?
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but provides hints for web crawlers to do a better job of crawling your site.
Sitemap 0.90 is offered under the terms of the Attribution-ShareAlike Creative Commons License and has wide adoption, including support from Google, Yahoo!, and Microsoft.
A Sitemap does not affect the actual ranking of your pages. However, if it helps get more of your site crawled (by notifying the search engines of URLs they didn't previously know about, and/or by helping them prioritize the URLs on your site), that can lead to increased presence and visibility of your site in their index.


Sitemap File Format
The most popular Sitemap format is XML, and it is supported by almost every type of search engine; even Google is pushing people to use the XML Sitemap format.
The XML Sitemap must:


  1. Begin with an opening <urlset> tag and end with a closing </urlset> tag.
  2. Specify the namespace (protocol standard) within the <urlset> tag.
  3. Include a <url> entry for each URL, as a parent XML tag.
  4. Include a <loc> child entry for each <url> parent tag.

All other tags are optional. Support for these optional tags may vary among search engines. Refer to each search engine's documentation for details.
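For readers who want to see what that structure looks like in practice, here is a minimal sketch in Python (purely illustrative and not part of the protocol documentation; the page URL, date and output file name are made-up placeholders) that builds a one-URL Sitemap with the standard library:

# Minimal sketch: build a one-URL XML Sitemap with Python's standard library.
# The page URL, date and output file name are placeholders for illustration only.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"  # Sitemap 0.90 namespace

# <urlset> is the opening/closing parent tag; the namespace is declared on it.
urlset = ET.Element("urlset", xmlns=NS)

# One <url> entry per page, each with a mandatory <loc> child.
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "http://example.com/catalog/show?item=23"

# Optional tags (support for these varies among search engines).
ET.SubElement(url, "lastmod").text = "2008-02-19"
ET.SubElement(url, "changefreq").text = "weekly"
ET.SubElement(url, "priority").text = "0.5"

ET.ElementTree(urlset).write("sitemap.xml", encoding="UTF-8", xml_declaration=True)

Running this produces a sitemap.xml that opens with the <urlset> tag in the Sitemap 0.90 namespace, contains a single <url> entry, and gives its location in the mandatory <loc> child.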


Other Sitemap formats
The Sitemap protocol enables you to provide details about your pages to search engines, and its use is encouraged since you can provide additional information about site pages beyond just the URLs. However, in addition to the XML protocol, search engines also accept RSS feeds and text files, which provide more limited information.
Syndication feed
You can provide an RSS (Really Simple Syndication) 2.0 or Atom 0.3 or 1.0 feed. Generally, you would use this format only if your site already has a syndication feed. Note that this method may not let search engines know about all the URLs in your site, since the feed may only provide information on recent URLs, although search engines can still use that information to find out about other pages on your site during their normal crawling processes by following links inside pages in the feed. Make sure that the feed is located in the highest-level directory you want search engines to crawl. Search engines extract the information from the feed as follows:

  1. <link> field - indicates the URL.
  2. Modified date field (the <pubDate> field for RSS feeds and the <updated> date for Atom feeds) - indicates when each URL was last modified. Use of the modified date field is optional.

Text file
You can provide a simple text file that contains one URL per line. The text file must follow these guidelines:

  1. The text file must have one URL per line. The URLs cannot contain embedded new lines.
  2. You must fully specify URLs, including the http:// prefix.
  3. Each text file can contain a maximum of 50,000 URLs. If your site includes more than 50,000 URLs, you can separate the list into multiple text files and add each one separately.
  4. The text file must use UTF-8 encoding. You can specify this when you save the file (for instance, in Notepad, this is listed in the Encoding menu of the Save As dialog box).
  5. The text file should contain no information other than the list of URLs.
  6. The text file should contain no header or footer information.
  7. You can name the text file anything you wish.

You should upload the text file to the highest-level directory you want search engines to crawl and make sure that you don't list URLs in the text file that are located in a higher-level directory.
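As a quick illustration of guidelines 1, 2 and 4, this small Python sketch (the output file name and URLs are placeholders) writes a plain text Sitemap with one fully specified URL per line, saved as UTF-8:

# Sketch: write a plain-text Sitemap, one fully specified URL per line, UTF-8 encoded.
# The output file name and the URLs below are placeholders.
urls = [
    "http://example.com/",
    "http://example.com/catalog/show?item=23",
]

with open("urllist.txt", "w", encoding="utf-8") as f:
    for u in urls:
        f.write(u + "\n")  # nothing but URLs: no headers, footers, or comments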

Sitemap Location
The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. A Sitemap file located at http://example.com/catalog/sitemap.xml can include any URLs starting with http://example.com/catalog/ but can not include URLs starting with http://example.com/images/.
If you have the permission to change http://example.org/path/sitemap.xml, it is assumed that you also have permission to provide information for URLs with the prefix http://example.org/path/. Examples of URLs considered valid in http://example.com/catalog/sitemap.xml include:

http://example.com/catalog/show?item=23
http://example.com/catalog/show?item=233&user=3453
URLs not considered valid in http://example.com/catalog/sitemap.xml include:
http://example.com/image/show?item=23
http://example.com/image/show?item=233&user=3453
https://example.com/catalog/page1.php
Note that this means that all URLs listed in the Sitemap must use the same protocol (http, in this example) and reside on the same host as the Sitemap. For instance, if the Sitemap is located at http://www.example.com/sitemap.xml, it can't include URLs from http://subdomain.example.com.
URLs that are not considered valid are dropped from further consideration. It is strongly recommended that you place your Sitemap at the root directory of your web server. For example, if your web server is at example.com, then your Sitemap index file would be at http://example.com/sitemap.xml. In certain cases, you may need to produce different Sitemaps for different paths (e.g., if security permissions in your organization compartmentalize write access to different directories).
If you submit a Sitemap using a path with a port number, you must include that port number as part of the path in each URL listed in the Sitemap file. For instance, if your Sitemap is located at http://www.example.com:100/sitemap.xml, then each URL listed in the Sitemap must begin with http://www.example.com:100.
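To make the location rule concrete, here is a purely illustrative Python check: a URL is accepted for a given Sitemap only if it shares the Sitemap's scheme, host and port and starts with the Sitemap's directory path. The example URLs are the ones used above.

# Sketch of the Sitemap location rule: a URL is only valid for a Sitemap if it uses
# the same scheme, host and port and starts with the Sitemap's directory path.
from urllib.parse import urlparse

def url_allowed_by_sitemap(url: str, sitemap_url: str) -> bool:
    u, s = urlparse(url), urlparse(sitemap_url)
    sitemap_dir = s.path.rsplit("/", 1)[0] + "/"  # e.g. /catalog/ for /catalog/sitemap.xml
    return (u.scheme, u.netloc) == (s.scheme, s.netloc) and u.path.startswith(sitemap_dir)

sitemap = "http://example.com/catalog/sitemap.xml"
print(url_allowed_by_sitemap("http://example.com/catalog/show?item=23", sitemap))  # True
print(url_allowed_by_sitemap("http://example.com/image/show?item=23", sitemap))    # False
print(url_allowed_by_sitemap("https://example.com/catalog/page1.php", sitemap))    # False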


Validating Sitemap
The following XML schemas define the elements and attributes that can appear in your Sitemap file. You can download this schema from the links below:
For Sitemaps: http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd
For Sitemap index files: http://www.sitemaps.org/schemas/sitemap/0.9/siteindex.xsd
There are a number of tools available to help you validate the structure of your Sitemap based on this schema. You can find a list of XML-related tools at each of the following locations:

http://www.w3.org/XML/Schema#Tools
http://www.xml.com/pub/a/2000/12/13/schematools.html
Another good site for validating your sitemap is
http://www.xml-sitemaps.com/
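If you prefer validating locally rather than through a web tool, one possible approach (a sketch that assumes the lxml package is installed and that you have downloaded the sitemap.xsd schema next to your sitemap.xml) is:

# Sketch: validate sitemap.xml against the downloaded Sitemap schema using lxml.
# Assumes lxml is installed and that sitemap.xsd and sitemap.xml exist locally.
from lxml import etree

schema = etree.XMLSchema(etree.parse("sitemap.xsd"))
doc = etree.parse("sitemap.xml")

if schema.validate(doc):
    print("sitemap.xml is valid against the Sitemap 0.90 schema")
else:
    for error in schema.error_log:
        print(error.line, error.message)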

Submitting Sitemap
Once you have created the Sitemap file and placed it on your webserver, you need to inform the search engines that support this protocol of its location. You can do this by:
· submitting it to them via the search engine's submission interface
· specifying the location in your site's robots.txt file
· sending an HTTP request
The search engines can then retrieve your Sitemap and make the URLs available to their crawlers.
Submitting your Sitemap via the search engine's submission interface
To submit your Sitemap directly to a search engine, which will enable you to receive status information and any processing errors, refer to each search engine's documentation.
Specifying the Sitemap location in your robots.txt file
You can specify the location of the Sitemap using a robots.txt file. To do this, simply add the following line:
Sitemap: <sitemap_location>
The <sitemap_location> should be the complete URL to the Sitemap, such as http://www.example.com/sitemap.xml
This directive is independent of the user-agent line, so it doesn't matter where you place it in your file. If you have a Sitemap index file, you can include the location of just that file. You don't need to list each individual Sitemap listed in the index file.
Submitting your Sitemap via an HTTP request
To submit your Sitemap using an HTTP request (replace <searchengine_URL> with the URL provided by the search engine), issue your request to the following URL:
<searchengine_URL>/ping?sitemap=sitemap_url
For example, if your Sitemap is located at http://www.example.com/sitemap.gz, your URL will become:
<searchengine_URL>/ping?sitemap=http://www.example.com/sitemap.gz
URL encode everything after the /ping?sitemap=, so the request becomes:
<searchengine_URL>/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.gz
You can issue the HTTP request using wget, curl, or another mechanism of your choosing. A successful request will return an HTTP 200 response code; if you receive a different response, you should resubmit your request. The HTTP 200 response code only indicates that the search engine has received your Sitemap, not that the Sitemap itself or the URLs contained in it were valid. To keep your Sitemap current, an easy approach is to set up an automated job that generates and submits it on a regular basis.
Note: If you are providing a Sitemap index file, you only need to issue one HTTP request that includes the location of the Sitemap index file; you do not need to issue individual requests for each Sitemap listed in the index.
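Here is a sketch of that request in Python; the ping endpoint and sitemap URL below are placeholders, so substitute the values documented by the search engine you are submitting to.

# Sketch: ping a search engine with the Sitemap location and check for HTTP 200.
# SEARCH_ENGINE_URL and SITEMAP_URL are placeholders, not real endpoints.
import urllib.error
import urllib.parse
import urllib.request

SEARCH_ENGINE_URL = "http://searchengine.example/ping"  # placeholder ping endpoint
SITEMAP_URL = "http://www.example.com/sitemap.gz"       # placeholder Sitemap location

# URL-encode everything after /ping?sitemap= before sending the request.
ping = SEARCH_ENGINE_URL + "?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")

try:
    with urllib.request.urlopen(ping) as response:
        # 200 only means the Sitemap was received, not that it or its URLs are valid.
        print("Search engine responded with", response.status)
except urllib.error.HTTPError as err:
    print("Got", err.code, "- resubmit the request")

Pairing a script like this with a scheduled job covers the automated generate-and-submit approach suggested above.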

Excluding Content
The Sitemaps protocol enables you to let search engines know what content you would like indexed. To tell search engines about content you don't want indexed, use a robots.txt file or a robots meta tag.


Antivirus reviews 2008 (part 2)

This brings us to the biggies, and Norton is probably the biggest antivirus brand out there, so let's start with that.

Norton Antivirus
Symantec product page
Symantec has a habit of coming up with minor variations of Norton every year: they're more or less the same, and only rarely significantly improved (some say v2007 was a leap ahead). Norton can be a real pain at times: some of its updates have reportedly been chaotic (a bit like Windows in that department, eh), and in my experience it's poor at catching viruses; the excellent/good reviews it keeps getting every year are perhaps due more to Symantec's aggressive marketing than anything else. I've used Norton longer than any other software (yes, any kind of software). The current version is 2008, but its update servers are still slow; and since it's the best-selling antivirus in the world, all the hackers seem intent on exploiting its weaknesses. Moreover, its uninstallation process is (and has always been) terrible: it just won't remove all the files, so you have to do that manually. And it's more RAM-hungry than many of the newcomers that do the job much better.
Summary: grossly overrated antivirus, poor detection rate, uninstallation process doesn't remove files.
Price: $39.99 (1 user)
McAfee VirusScan Plus

McAfee product page

At one point in time, McAfee could certainly have boasted of having bested Norton in the popularity rankings. We used to think it was the most 'solid' antivirus program out there, and back then we were right. But excessive RAM consumption has always been McAfee's most dislikeable feature. Even today, if you're running McAfee on an average PC you'll surely notice an overall drop in your desktop experience. Most other antivirus software has improved greatly, and McAfee can't really claim it's the best or hardest on viruses anymore -- probably in any department. The interface badly needs a sensibility implant (you'll get cautioned if you don't have McAfee Internet Security or Parental Control on). Perhaps the company needs such an implant too: how else would you explain failing to cash in on the very useful and popular SiteAdvisor tool? And of course, SiteAdvisor alone can't be enough to hope users will stick around with the antivirus. Among the improvements, McAfee scans links while you're chatting away on AOL or Yahoo IM (though it doesn't support the latest versions to date). Funny: McAfee won't scan your computer upon installation, but once installed it will proclaim you're protected.
Summary: good detection rate, can scan links within some IMs; the big daddy of RAM hogs, clumsy interface.
Price: $39.99 (1 user)


Trend Micro AntiVirus plus AntiSpyware

Trend Micro product page

You might not think of PC-Cillin as one of the antivirus superweights, but a few years back Trend Micro had certainly won many hearts with its flagship product. Today the name has changed (it's ugly and longish), but it's still fast and installs like a breeze. Updating is easy -- but the update server seemed slow. I remember trying everything I could sometime last year to clean my PC of a worm infection. PC-Cillin found it, but couldn't fix or delete it. In other words, the virus seemed to take over PC-Cillin in a snap, because after that PC-Cillin couldn't even find it! I didn't have the same virus on my computer this time, so I couldn't check how v2008 fared against it. In general, Trend Micro products have a history of failing to detect old viruses, even macro and boot viruses. The latest version has got a facelift, can scan links while chatting within AOL and Yahoo IMs (it supports the latest versions), seems faster, and certainly nails malware better. Independent reviewers haven't really shown much interest in reviewing Trend Micro yet, so you can't find much data on the internet; in other words, there's no statistical evidence that this antivirus performs better than another. From my experience, it feels like a much-improved piece of software.
Summary: good detection rate, can scan links within some IMs, good interface, fast; lack of performance data
Price: $39.95 (1 user)

-----------------------------------------------------------------------------

Monday, February 18, 2008

Introduction to SEO Tutorials

Before learning the techniques and theory behind search engine optimization, it is important to understand the basics. Below are some tutorials which explain those basics, as well as a brief explanation of the search engines upon which SEO is built. Also included are some technical papers about search engine algorithms which the more advanced student of SEO may find useful.

What is search engine optimization?

Taking the search engine point of view

How do the search engines determine relevance?

Basic link terminology


Technical papers relating to Google’s Algorithm:
PageRank
Topic Specific PageRank
LocalRank
Hilltop
Latent Semantic Indexing
TrustRank

How to optimize your site : SEO process

How to optimize your site for the search engines: an overview of the SEO process
Before we jump into the nitty-gritty details of search engine optimization, it is helpful to have a general idea of what it is that we do and why. There are many useful techniques for optimizing your site for the search engines that we will discuss, but having a general idea of the process as a whole will help make sense of those techniques, as well as put them in their proper context.
What is interesting about the SEO process as a whole is that it really isn't that complicated. When it comes down to it, there are five basic steps that you will need to take in order to properly optimize your site for the search engines. The success of your optimization campaign will largely rest on how well you execute these five steps. What that means, though, is that if you do learn how to implement all of these steps well (or hire someone who does), then you will most likely have a significant advantage over your competition (who may not be optimizing their site at all, or at least not properly).
Here, then, are the five steps to properly optimizing your site for the search engines:
  1. Create a search engine friendly web site
  2. Develop a well-organized list of popular, targeted keywords
  3. Optimize your web pages for those keywords
  4. Attract and/or obtain large numbers of relevant, high-quality links to those optimized web pages.
  5. Analyze and improve upon the results of your optimization campaign.

In later tutorials we will discuss each of these steps in and of themselves. For now, though, let’s get a general understanding of each of these steps and why they are important.

Create a search engine friendly web site The search engines have programs (popularly known as "spiders") which “surf” the internet, downloading websites and processing the information contained within (popularly known as crawling the internet). What is crucial to understand is that if these computer programs have trouble downloading or processing your web pages then those pages may never be included in any search ever done on the internet. Only pages which the search engines can process can have a ranking. All other pages are effectively invisible as far as the search engines are concerned. Therefore, your first order of business when optimizing your website for the search engines is to make sure that you create a search engine friendly website. Otherwise, all of your other optimization efforts may be effectively worthless.
Develop a well-organized list of popular, targeted keywords One of the unique aspects of internet marketing is that your customers are already actively looking for your products or services online. Your job is simply to place yourself in their path. You do this by discovering the terms and phrases which your customers use when searching for your products and services on sites such as Google, Yahoo, and MSN and then optimizing your site for those terms. However, given that your competitors are also looking to optimize their websites for these very same keyword phrases, it is important to organize your keyword list according to their level of competition. You then start optimizing your site for those keywords which have less competition. Later on you can optimize your site for the more competitive phrases.
Optimize your web pages for your keywords In order to optimize your website for your keywords you need to create unique web-pages, each of which are tightly focused on 1-2 keywords. Your title, header, meta, and alt tags should all include your main keyword phrases in them (don’t worry if you don’t understand all of these terms, they will all be explained in the relevant tutorials). You should also pepper the content of your web pages with those same keyword phrases. These pages are the ones that you want to rank well. Of course, you also want these pages to speak to your customers, so they must be written in a way which appeals both to the search engines and your visitors.
Attract and/or obtain large numbers of relevant, high-quality links to your optimized web pages Links are the backbone of high rankings. In order to rank well for your keywords you need to attract and/or obtain links from authoritative and/or popular websites which relate to your product or services and which link directly to your optimized web page. What’s more, whenever possible you want the link text to contain the keyword phrase which you optimized your web page for (the link text, also known as anchor text, is the hyperlinked text which is oftentimes blue and underlined, such as click here to read this tutorial). The search engines (Google in particular) love these kind of links as it says to them that your site is important (because important sites link to it) and that your web page is a main source for that particular keyword phrase (which is why people have the phrase in the link text of their links). Your overall link building strategy, therefore, is for each page on your site to obtain and/or attract enough of these high-quality incoming links so that it can dominate the search engines for the keyword that it is optimized for.
Analyze and improve upon the results of your optimization campaign Once you have setup your search optimization campaign it is crucial to analyze the results of that campaign. You want to note those areas wherein you were successful so that you can build upon that success. Similarly, you want to note those areas where you were not successful so you can avoid or correct faulty strategies or practices. The goal is to find those techniques which work for your particular products or services. This is only possible by constantly analyzing and fine-tuning your optimization campaign so that you can better utilize your time, money, and effort.
A final note: this process takes time. For some keywords you may have to wait as much as a year to see significant results. For the majority of (semi) popular phrases you will have to wait at least 2 - 3 months and most likely at least 3 - 6 months. In other words, search engine optimization is a slow process. The benefit, of course, is that if you successfully optimize your site you can draw large numbers of targeted customers to your web site for relatively little money. Of course, many businesses need a more immediate flow of traffic. For them, pay-per-click advertising (aka paid search) is the way to go. Paid search offers the benefit of driving targeted traffic to your site within a matter of minutes (for Google AdWords) or days (for Yahoo Search). On the other hand, paid search costs money. Oftentimes it is worth running both a pay-per-click ad campaign as well as a search engine optimization campaign.

Basic link terminology

Link Basics - the fundamental information you need to know to understand what links are and how you can best use them to affect your position in the search engines.
To start with, there are 7 concepts which you need to understand. They are the following:


  1. Anchor Text

  2. Inbound links

  3. Outbound links

  4. Reciprocal links

  5. Triangular linking

  6. Link Popularity

  7. Search Engine Algorithm

Anchor Text - the visible text in a link.


Let's take a look at a link: "What is search engine optimization". This linked text is a link to the Search Engine News sales page located at www.searchenginehelp.com/sales. What interests us about this link is that it leads to a particular page and that it uses text other than the actual URL (in our case, "search engine optimization"). This visible link text is known as anchor text. The search engines in general, and Google in particular, lend a great deal of significance to the anchor text when determining search rankings. As such, getting your keywords into the visible, or anchor, text of the links pointing towards your site is one of the most important elements of high-ranking web pages.
Inbound Links


An inbound link is a link from another site pointing towards one of your web pages.



Outbound links


An outbound link is a link on one of your web pages that points to a web page on someone else’s website.



Reciprocal Links


Whenever two sites agree to link to one another it is said that they are exchanging links, or that they have reciprocal links. At one time gaining a lot of reciprocal links was a good way to boost one’s rankings in the search engines. However, this system became greatly abused with many sites working hard to exchange links solely for the purposes of ranking well in the search engines. As a result, the search engines have greatly devalued the value of a reciprocal link, preferring one-way, inbound links. Still, there are times when exchanging links has some value, such as when you exchange links with "important" web sites.


Triangular Linking


Sometimes three different websites will attempt to "outsmart" the search engines by linking to each other in such a way that none of them "reciprocate" the link that they receive. For instance, site A may link to site B, site B to site C and site C to site A. That way each of the sites receives a one way inbound link. The search engines are rarely fooled by such schemes and will often penalize sites that participate in them. Remember, the search engines have available to them a great deal of cash and talent which they can dedicate to exposing and undermining such schemes. It is not for nothing that they hire computer science engineers with PhD’s from the top universities.


Link Popularity


Link popularity measures the sheer number of incoming links to a website. The search engines used to place a great deal of importance on link popularity when determining search results, based on the assumption that quality web pages attracted a large number of incoming links. However, as websites began to take advantage of this fact and engage in various schemes to attract hundreds, if not thousands, of irrelevant incoming links, the search engines placed less importance on link popularity alone and started to also evaluate the quality of each incoming link.



Search Engine Algorithm


Each search engine has a set (or a number of sets) of preset rules which it employs to determine which sites show up for any given search query, as well as the order in which those sites appear. These rules are known as an algorithm, and each search engine employs its own, unique algorithm. One of the more famous algorithms is Google's PageRank which, roughly speaking, measures the importance of any given webpage. Today there are a number of newer algorithms which one should also be aware of, such as Hilltop, LocalRank, Latent Semantic Indexing and more. Almost all of these newer algorithms relate to evaluating the quality of incoming links, as links are one of the most central factors used in the search engines' algorithms (this is particularly true with regard to Google). Needless to say, it is extremely important to keep abreast of the changes that the search engines make to their algorithms, as these changes affect the strategies and techniques one needs to employ to rank well in the search engines.


Taking the search engine point of view: why you want whatever the search engines want.

Let's take a second and try to look at the internet from the search engines' point of view. Why do the search engines provide us with the invaluable service of letting us search the internet? The answer, not surprisingly, is money. The more users a search engine has, the more potential it has to make money. But how, and what does this have to do with search engine optimization?
The answer to the first question is simple enough: ads. Search engines make money by showing ads alongside their search results. For instance, imagine that you did a search on Google for free cell phones. You would arrive at a page that looks like this:



On the left-hand side of the screen are what are known as the organic search results. These are the websites which Google thinks are the most relevant for the term free cell phone (note: everything we are about to say about Google also applies to the other major search engines: Yahoo, MSN, and Ask.com). Furthermore, Google does not take money for showing these results. They are displayed based on Google's algorithm (a mathematical formula which Google uses to determine how to rank sites for any given search term). On the right-hand side of the page are different results which also show up for the term free cell phone. These are ads, or more exactly Google AdWords ads, and the owner of each ad pays Google every time someone clicks on it (the amount varies according to various factors).
What is worth noting is that Google's entire business plan revolves around people clicking on these ads. As such, the more people who use Google's search engine, the more money Google will make (as more people will see Google's ads, with a certain percentage of those people clicking on them). Thus Google has a vested interest in providing the highest quality search engine that it possibly can (as do the other major search engines), for that is what drives people to its site. And the key ingredient of a quality search engine is relevance! After all, people are only interested in a search engine insomuch as it helps them find the results they are looking for. And since that is what Google's customers want, that is what Google wants. And since that is what Google wants, that is also what we want.
Here comes the answer to our second question. When we say that Google ranks the search results according to relevancy what we really mean is that Google has developed various criteria and methods for determining what is the most relevant site for any given search. What this means for us is that if we can discern what those criteria and/or methods are (Google doesn’t reveal them) then we can build our site accordingly for the terms that we want to rank well for. Put simply, Google sets the ranking rules. If we want to rank well then we best learn what those rules are and follow them. This, in a nutshell, is all that search engine optimization is about.
As simple as that may sound it’s actually a bit harder to do in real life. Particularly since the search engines are constantly trying to improve the results that they return. What that means is that the criteria and methods that the search engines use to rank sites are constantly changing. So not only is it important to know how it is that the search engines rank sites today, but it is equally important to get as clear a sense as possible as to how they plan to rank sites tomorrow. That way you can always be prepared (or at least try to be prepared) for whatever changes come tomorrow. Thus ensuring that your high rankings are as stable as can be.
Our first order of business, therefore, when it comes to optimizing our sites for the search engines is to understand as best as possible the criteria and methods that the search engines use to determine their search results. As such, that is the topic of the next tutorial.

Previous tutorial: What is search engine optimization?


What is search engine optimization (aka SEO)?

Imagine that you’re a cell phone vendor and that a potential customer is searching on Google for the phrase “free cell phone”. That customer will most likely reach a page like this:


There are three parts of this page that we need to note. The first is the search term (in this case "Free Cell Phones"), the second is the paid ads, and the third is the natural search results. See the image below:

For these tutorials we are only interested in the search term and the natural search results. We can forget about the paid ads as they have nothing to do with search engine optimization. You don't optimize your site for the paid ads; you bid for them. They are part of the Google AdWords program and will be discussed in our Paid Search (aka pay-per-click) tutorials.

Search engine optimization only deals with what are called “natural” or “organic” search results. Nobody pays to show up in the natural results. The results displayed in this section are there only because Google “decided” that they are the best pages on the Web to display for the search term “free cell phones”. If you entered a different search term you would see different results.

Now, with that said, let’s return to our imaginary cell phone site. As a cell phone vendor, your goal would be to have your web site show up in the top 10 - 20 search results for all searches related to cell phones (most people don’t scroll down beyond the top 10 - 20 search results). By showing up in the organic search results for the major search engines (Google, Yahoo, MSN, and Ask.com) you can bring in large numbers of potential customers to your website…..for free!

On the other hand, imagine the business you’re missing out on by ranking poorly in the search engines. This is why so many people spend a great deal of time, money, and effort trying to get their sites to rank well in the search engines.

You're probably wondering: what do you have to do to rank well for the terms and phrases related to your site? Simple: you have to optimize your website for the search engines. That means you have to figure out how the search engines determine their results for any given search and then build your website accordingly. That's what we mean by "optimization": building and marketing your website to rank well within the major search engines.

This brings us to our next article: How do search engines rank and determine the order of their search results?

Next post/tutorial: Taking the search engine point of view

Sunday, February 17, 2008

The best keyword research tools available

Here is a short list of some of the best keyword tools available:

Free keyword tools

1. Overture Keyword Suggestion Tool (aka Yahoo’s Search Term Suggestion Tool - STST)
The Overture keyword tool displays results from the previous month for keywords searched for within the Yahoo network.

2. Google Adwords Keyword Tool (GAKT)
The Google Adwords Keyword Tool displays results for keywords searched for within the Google network.

3. Microsoft AdCenter Ad Labs Tools
Microsoft AdCenter offers a variety of interesting and potentially useful tools to assist you in your keyword research.

4. Google Suggest
As you type your search, Google Suggest offers keyword suggestions in real time. Useful for learning what terms are searched for on Google.com.

5. Associated keywords — Yahoo, Ask.com, Clusty, and Gigablast all offer suggestions for related terms that are typically associated with a given keyword.

  1. Yahoo (www.yahoo.com) - when you perform a search on Yahoo, it will oftentimes suggest related keywords above the search results.
  2. Ask.com (www.ask.com) - when you perform a search on Ask.com, it will oftentimes suggest related keywords to the right of the search results.
  3. Clusty (www.clusty.com) - when you perform a search on Clusty, it will oftentimes suggest related keywords to the left of the search results.
  4. Gigablast (www.gigablast.com) - when you perform a search on Gigablast, it will oftentimes suggest related keywords above the search results.

6. Quintura
Quintura is not exactly a keyword research tool, but it’s close enough. What’s more, it may be better than a keyword research tool. Quintura gives you a list of keywords which are associated with your main keyword. To learn why this is helpful, read this article!

Keyword Tools Which Cost Money

1. Wordtracker
Wordtracker displays results for keywords searched for over the last 90 days in two meta-search crawlers (Dogpile, www.dogpile.com and Metacrawler, www.metacrawler.com). A meta-search crawler searches the most popular search engines (such as Google, Yahoo, MSN, Ask.com, and more) and retrieves the “best” combined results.

2. Keyword Discovery
KeywordDiscovery displays search statistics from over 180 search engines worldwide (Keyword Discovery obtains its data either by importing the search logs of various search engines or by collecting samples from those search engines by scraping search statistics from ISP logs and other sources).

3. Keyword Intelligence
Keyword Intelligence claims to display the most popular search terms for your industry so as to help you identify the keywords that have worked for your competition.

4. Keyword Elite
Keyword Elite can help you build and analyze large, relevant keyword lists. An extremely useful tool when used properly.


Link to previous post: Antivirus reviews 2008 (part 1)
Link to this blog: Random-View


Saturday, February 16, 2008

Antivirus reviews 2008 (part 1)

Intro: Linux users are lucky: there aren't really many viruses that target them. Even Mac folks don't have to lose sleep over virus attacks too often. But life for us Windows people is different. Much of our digital worry hovers around the possibility that a single malicious program could turn the world upside down. To make sure I'm insured for the next digital doomsday, over the past few weeks I've checked out all the major brands of antivirus software, including free antivirus software (all the latest versions to date). My poor PC had to go through a horrible degree of abomination, but in the end my findings were well worth it. First of all, you should know that:
· this isn't a data-centric report. I have neither the intention nor the resources to come up with that kind of thing (you can check out Download.com, About.com, Consumer Search, PC Magazine and PC World for stats; amazingly, the test results differ across these reviews even when they're using the same parameters. Also, google to look up a review of a particular antivirus version).
· what I've come up with is a plain-talk user account. However, I've tested (read 'applied') the different home-user versions of standalone antivirus software on my PC under the same conditions (courtesy of Acronis TrueImage; OS: Windows XP).
· all prices mentioned are in US$ and generally include a 1-year virus definitions subscription.
All the major reviews this year seem to be bent on hailing BitDefender or Kaspersky. You should know that each year it's different; a few years back the tussle was between Norton and McAfee, and after that PC-Cillin and Kaspersky broke in. There always seems to be a general drift towards celebrating a particular antivirus brand. But trust me: they're almost always wrong. Or they do all their testing on alien PCs (no pun intended).

BitDefender

BitDefender product page
This year's champ won many hearts with its price tag: a one-year subscription for 3 users (yes, that wasn't a typo) costs only $23.96. BitDefender is a vastly improved product in its current incarnation (the current version is 2008), although version 9 (the last one I used) was good too. BitDefender did a good job cleaning up my PC; it looks sleek these days, and scan speed seems to have slightly improved. However, my chief complaint against BitDefender is that it eats up a lot of your RAM. And a hell of a lot: in fact, if you're using an older machine you might even think your PC has crashed for good (I tested it on a 2.0-something GHz Celeron, and it's going to curse me for the rest of its days).
Summary: good detection rate, great price tag, improved interface; slow scan speed, RAM hog.
Price: $23.96 (for 3 users)




Kaspersky

Kaspersky Labs product page
I've used Kaspersky since its infancy, and the thing I like about it is the way it updates its database. Kaspersky responds to security threats fast, and scans all Internet traffic in real time to block viruses before they are saved to disk. Kaspersky isn't impenetrable, as its fans (including myself) used to believe. But then again, that faith stems from the fact that it's so good at catching viruses -- if not the best. It scans well and fast (unless you're using the highest settings), and one might argue that the relatively high $59.95 price tag for a single-user license is worth it. The interface is better than before, but could still improve; it seems to eat up more RAM than previous versions. The problem with Kaspersky is an almost silly one: its update mechanism often fails, leaving you with minutes of sluggish online experience; worse, it will then keep nagging you even if your virus database is only a few days old, making the situation seem much worse than it really is (many users, especially people on dial-up, rely completely on weekly, bi-weekly or even monthly virus definition downloads). The situation really gets on your nerves when you discover that Kaspersky is downloading all the files it needs, but somehow can't update its database. While you can adjust the way Kaspersky updates, it's worth pointing out that the auto-update is perhaps its prime feature. You get the idea. The latest version is 7.0.

Summary: good detection rate, quick to respond to virus outbreaks; update mechanism acts weird at times, eats more RAM than previous versions, interface could be better.

Price: $59.95 (1 user)


ZoneAlarm Antivirus

Check Point product page (antivirus) (security suite)

ZoneAlarm is perhaps best known as a great firewall, but these days the company (Check Point) has started offering an antivirus as well. It's essentially a variant of Kaspersky (version 6.0?), but it's not as good. It scanned really, really slowly; it doesn't have enough options for scans; it fared poorly in cleaning up registry entries generated by malware; it came up with false alerts (an area where other antivirus programs have improved greatly); and I really can't trust Check Point on effective user support or disaster management, because they're into firewalls, not antivirus software (and, the antivirus they made sucks). The only reason I'm discussing ZoneAlarm Antivirus at all is it's part of the ZoneAlarm Internet Security Suite (v7.1) -- and the bundle comes pretty cheap at $49.95.
Summary: good detection rate, not heavy on the wallet (as part of the ZA Security Suite); slow scan speed, can't clean up leftover registry entries, lacks scanning options, Vista support issues.
Price: $29.95 (1 user); $49.95 for ZA Security Suite (1 user)

Panda



I liked the looks of Panda Antivirus: any antivirus software that's called 'panda' deserves praise :D Panda Antivirus (v 2008) installs quickly, and its real-time scanner is good and even underrated. But the good part stops there. Panda doesn't seem to respond to new threats quickly enough, and once installed, your PC takes an annoying while to boot (not to mention the panda head icon that appears on the bottom-right corner of your screen and gets on your nerves soon enough. You might even end up appreciating pandas less). Independent reviewers seem to be uninterested in Panda as well, which makes it hardly a popular choice. Worst part: Panda eats up a lot of your RAM.
Summary: good detection rate, affordable; slow scan speed, RAM hog extraordinaire.
Price: $39.95 (for 3 users, 1 year)



F-Secure

F-Secure product page



I ended up using F-Secure for almost the entire trial period. The new version is massively improved and it detects viruses fairly well (reviews have traditionally underrated F-Secure's engine). It updates quickly and frequently (in small files, a lot like Kaspersky) too. The reason I gave up on it is it eats up a lot of RAM. In that respect, it shares the same curse as BitDefender. It's also pricey.

Summary: good detection rate, good update mechanism; pricey, RAM-intensive.
Price: $97 (for 3 users; the price is actually €65.90 on the company website)


Next: part 2 (the biggies). Will post it soon.


Link to next post: The best keyword research tools available
Link to previous post: Strange Google Crawler
Link to this blog: Random-View