Monday, November 24, 2008

Promotional Web Hosting

I think this information may be useful to you. If you plan to launch your own website, here is a good free web hosting provider to consider - 000webhost.com

They provide hosting absolutely free; there is no catch. You get 1500 MB of disk space and 100 GB of bandwidth. They also offer the cPanel control panel, which is excellent, and an easy-to-use website builder. Moreover, there is no advertising of any kind on your pages.

You can register here: webhost

Web Hosting Service

I would like to tell you about the free web hosting service I am using now.
Register here: webhost

They give you 1500 MB of disk space and 100 GB of data transfer. I have been using them for about three months now and have never seen any downtime or server problems. There is no advertising of any kind on my pages either, so I think it's worth signing up.

Professional Shared Hosting

If you want professional shared-hosting quality in a free hosting package, come and host with 000webhost.com and experience the best service you can get absolutely free.

Founded in December 2006, 000webhost.com has a trusted base of over 60,000 free hosting members and counting! Offering professional-quality hosting, support, uptime, and reliability, we have a great community of webmasters that you’d love to be a part of!

Register now and get it all free:
*** 1500 MB of disk space
*** 100 GB of data transfer
*** PHP and MySQL support with no restrictions
*** cPanel control panel
*** Website Builder
*** Absolutely no advertising!

Join us now: webhost

Web Hosting Package With Advanced Features

Free cPanel Web Hosting with PHP5/MySQL - no advertising!
Register now: webhost

We can offer you a free web hosting package packed with advanced features for hosting & building professional dynamic websites. We provide secure free web space with all the web hosting tools you could possibly ever need.

Our package includes:
- 1500 MB of Disk Space, 100 GB Bandwidth
- Host your own domain (http://www.yourdomain.com)
- cPanel Powered Hosting (you will love it)
- Over 500 website templates ready to download
- Easy to use website builder
- Free POP3 Email Box with Webmail access
- FTP and Web based File Manager
- PHP, MySQL, Perl, CGI, Ruby.
- And many more..

Click here to visit us: webhost

Free cPanel Web Hosting

We offer completely ad-free cPanel web hosting. No ads will ever be forced onto our users' webpages. There are no catches, no setup fees, no forced advertising, no banners, no popups, and no hidden charges. Just a totally free hosting service.

Register now; here is what's waiting for you:
* 1500 MB Disk Space
* 100 GB Data Transfer
* cPanel Control Panel
* Website Builder
* 5 MySQL Databases
* Unrestricted PHP5 support
* Instant Setup!

This service is perfect for starting a new online community, blog or personal website! More info at webhost

Web Hosting With No Advertising

We offer free web hosting with no advertising - there is no catch, so what you see is what you get. Here is our offer to you:

1500 MB Disk Space
100 GB Bandwidth
Free Subdomain or Your Own Domain
Automated Scripts Installer (20 Popular Scripts)
FTP Access and Web Based File Manager
Easy to use Website Builder
5 MySQL Databases with Full PHP Support
Zend & Curl Enabled
IMMEDIATE Activation!

Signup Now with Instant Activation - webhost

Free Host

Today I just found this free host with:

- 1500 MB of Disk Space
- 100 GB Bandwidth
- Your own domain hosting
- cPanel Control panel
- Website Builder
- Over 500 website templates ready for download
- Free POP3 Email Box and Webmail access
- FTP and Web based File Manager
- PHP, MySQL, Perl, CGI, Ruby.
- No ads at all!

Check it out Here: webhost

Monday, November 17, 2008

Automated Bots: A Problem for Web Analytics

Anyone who has spent significant time analyzing web logs, as I have, understands that automated bots, or web robots, create an enormous challenge for those who try to make good business decisions based on traffic trends.

An automated web robot is simply a crawler: a program that reads web pages and follows links on those pages to the next set of pages. These robots crawl the web from page to page to page gathering information. Robots have many different purposes for crawling the web:

* indexing web pages so that a search engine can find the pages relevant to a given search,
* gathering information from web pages (also known as screen scraping),
* learning business relationships between sites by understanding who is linking to whom,
* maliciously harvesting email addresses for the purpose of sending unwanted email (spam), and
* many other applications that a crawler's developer has designed.
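The crawl loop described above -- load a page, collect its links, follow them -- can be sketched with nothing but the Python standard library. This toy link extractor is the core of such a crawler; the fetch-and-queue loop around it is omitted since it would need live network access:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    """Return every link on a page; a crawler would queue these to visit next."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

A real robot would repeatedly pop a URL from its queue, fetch it, run it through `extract_links`, and push any unseen URLs back onto the queue.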

The problem these robots create is that they leave "tracks" on your web site -- log entries created by your web server indicating that a visit to a web page on your site has occurred. These entries confuse your web analytics engine into thinking an actual human visitor has visited your site. If you try to make a business decision based on the behavioral patterns of visitors to your site, these tracks from automated robots, when mixed with the tracks left by real visitors, have the potential to introduce enough bad data to lead you to a wrong decision and, consequently, wrong actions.

These robots are very easy to develop and as a result, they are everywhere. Unlike human visitors to your site, they are not going to make a purchase on your site, purchase from your advertisers, or subscribe to your newsletter or RSS feed. You basically want to eliminate these robots from your web analytics engine. However, this is not so simple.

The Internet community introduced an advisory standard in 1994 known as the Robots Exclusion Protocol (see Wikipedia definition), which lays out rules that well-behaved web robots should follow. One such rule is to read the "robots.txt" file on your server, which gives you the opportunity to specify which pages should never be read by a robot. Placing such an exclusion file on your server is, however, impractical for the purposes of improving web analytics, for a couple of reasons:

* Restricting robots prevents Google, Yahoo, and other search engine robots from crawling and indexing your site. This keeps your pages out of natural search listings, which is a problem far larger than confused web analytics.
* Many robots, especially those that have malicious intent, will completely ignore the robots.txt exclusion file. Since the Robots Exclusion Protocol is advisory in nature, robots can simply ignore the file completely.
* If your site is on a public hosting site, such as a blog-hosting service, you may not have permission to create your own robots.txt file and place it there.
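For illustration, here is how a compliant robot evaluates such an exclusion file, using Python's standard urllib.robotparser (the rules and the example.com URLs below are made up):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that hides /private/ from all robots.
# The protocol is advisory: nothing forces a robot to honor it.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("AnyBot/1.0", "http://example.com/private/data.html"))  # False
print(rp.can_fetch("AnyBot/1.0", "http://example.com/index.html"))         # True
```

A well-behaved crawler calls `can_fetch` before every request; a malicious one simply never downloads the file.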

Furthermore, robots take many steps to make their visits look just like visits from human visitors. While Google and other search crawlers identify themselves by sending an unambiguous User Agent (see Wikipedia definition) which web analytics engines can strip out, many web robots identify themselves as a typical browser, such as Internet Explorer or Firefox. This spoofing makes it especially difficult for your analytics engine to differentiate between a human and a robot visit.

Despite their clever attempts to be seen as human visitors, robots typically have one significant limitation: the vast majority of them do not interpret the JavaScript present on your site. This limitation is largely due to the fact that developing a JavaScript interpreter is far more complex than developing a crawler that simply traverses HTML links. The ramification is that while a standard browser used by a human visitor will load the HTML of a web page and then immediately load all other resources, including those served by JavaScript, a robot will load the HTML of the page, perhaps the graphics specified by the HTML, and that's it.
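A log-side filter for the robots that do identify themselves might look like the following sketch (the signature list is a tiny illustrative sample, not a real bot database):

```python
# Substrings that commonly appear in self-identifying robot User Agents.
# Illustrative sample only; production lists are far longer.
BOT_SIGNATURES = ("googlebot", "msnbot", "slurp", "crawler", "spider")

def is_declared_bot(user_agent):
    """True when a User-Agent string openly identifies itself as a robot.

    Spoofed robots claiming to be IE or Firefox slip straight through,
    which is exactly why a JavaScript-based tracker is the stronger filter.
    """
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

print(is_declared_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print(is_declared_bot("Mozilla/5.0 (Windows; U; MSIE 7.0)"))       # False
```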

As a result, if you implement your web analytics using a JavaScript-based tool, you are likely to all but eliminate the problem of web robots polluting your analytics data. Analytics Wizard from AnalyticsWizard.com is an example of a free JavaScript-based web analytics tool which I developed and suggest you try.

Keep in mind that even the JavaScript solution to preventing robot pollution may be short-lived. If browsers have the technology to read JavaScript, eventually robots will obtain that technology as well, and we will need to find better means of filtering them out. However, it is a good bet that there will always be far more robots that do not read JavaScript than ones that do.

Web Analytics is Recession Proof?

For the past few weeks I have been thinking about the economy and trying to reconcile two seemingly contradictory observations:

1. The economy sucks, and it doesn’t seem likely to improve anytime soon
2. The web analytics sector is reportedly recession-proof and, in fact, predicted to grow in 2009

While I hardly need to provide any proof of the first observation, evidence for the latter has been emerging from a variety of voices in our community for the past few months. Case in point:

* In January, the Web Analytics Association reported that nearly 70% of companies planned to continue to invest in new tools and staff throughout 2008
* Back in July, Corry Prohens from IQ Workforce wrote in a guest post in this blog that “74% of practitioners expect that spending on web analytics will increase at their company during the recession”
* In September, my good friend Jim Sterne surveyed the web analytics crowd and found that 87% of practitioners plan to maintain or increase their budgets for web analytics tools and services in the face of the current economy, and that nearly twice as many respondents indicated they planned to increase budgets for web analytics (21%) as planned to decrease same (13%)
* Last week, the fine folks at E-consultancy splashed all across the news with the headline “Web Analytics: A Silver Lining in the Recession Cloud” by reporting that web analytics was poised to grow in 2008 by 12%

In the E-consultancy report, the organization’s head of research Linus Gregoriadis was quoted as saying: “The profile of Web analytics continues to grow as it becomes more integral to business decision-making and organisational strategy. The credit crunch is putting the spotlight on analytics as organisations work harder to understand where they are getting the best return on investment and where real value is being added.”

Recently, Josh James, the CEO of Omniture said something similar during the Q&A portion of the company’s Q3 earnings call in response to a question about whether businesses saw web analytics as discretionary:

“Every dollar that a marketer has, I think everyone has in every organization is under pressure right now and certainly marketing spend is where CFOs like to look and see if they can cut. But, what we’ve seen with our customers is their online channels are the ones that are performing the best. Their online channels are the ones that are giving the most direct impact within that quarter that spend is also taking place.

In terms of the way that they think about Omniture, even if they cut let’s say 10% of their marketing spend, they’re going to use us to a) identify the 10% they’re going to cut and b) use us to optimize the other 90% to try to get back up to the same results as they had with the 100% the year before. These kinds of times actually drive usage of our product.

When things are good it’s a lot easier when you want more sales just to throw more money at the top of funnel and to generate more leads and go through the process. When things get bad people try to focus on of everyone that’s already coming to our store, what can we do to keep them more attracted? What can we do to get them to look at other things? What can we do to get them to read additional articles? All of those behaviors drive uses of our product.”

All of this sounds absolutely spectacular. Except for one thing …

I’m not sure I believe any of it.

I think that we are collectively starting to suffer from the echo chamber effect, essentially reiterating that web analytics will be fine in this lousy economy because, unsurprisingly, we are all making money off of web analytics and we would very much like to continue doing so. The WAA, IQ Workforce, my friend Jim, E-consultancy, Omniture, me … our collective businesses are all more or less explicitly tied to continued investment in the sector. So why wouldn’t we look for data that suggests that the picture continues to be rosy and the future bright?

Why indeed.

In terms of the data presented above, as a former researcher I would offer this assessment: many of these surveys appear to suffer from sample bias. Asking the Yahoo! group, members of the Web Analytics Association, or the audience attending Emetrics about their interest, investment, or organizational focus on web analytics is kind of like asking your average Democrat in Portland, Oregon how they feel about Barack Obama. The problem is not the audience, the problem is the interpretation: I think it is misleading to extrapolate the responses from a non-random sample of businesses and business people to the larger audience.

This kind of sampling leads to claims like “52% of online marketing managers are currently engaged in A/B or multivariate testing …” Fifty-two percent implies that tens of thousands of online marketing managers are testing. Which sounds great, except that when Offermatica and Optimost were acquired by Omniture and Interwoven they had a few hundred customers between them, and Stephane Hamel’s WASP tool reports that 0.4% (zero point four percent) of the Top 500 online retailers are using easily detected A/B or multivariate testing tools.

Don’t get me wrong, I too have been guilty of sampling biased audiences, although in the past year I have stopped conducting primary research due to both the sampling issue and the plethora of free research that suddenly appeared in the marketplace.

Ultimately I’m suspicious of this optimistic data that we’re seeing, especially in the context of statements like this one made by Mr. James made on the earnings call referenced above about the effect the economy is having on Omniture’s ability to forecast Q4 and 2009:

“Towards the end of September however, it became apparent that the challenging macroeconomic and financial environment may have some impact on our business going forward although it remains difficult to quantify the uncertainties specifically.”

Mr. James and his CFO specifically don’t want to talk about 2009 on the call. Which makes sense to me, since here are some other data points:

* The economy sucks, and without belaboring the obvious, it appears that this suckiness will stay with us for quite some time;
* While I don’t question Mr. James’ assertion that his best customers make excellent use of web analytics, in my personal experience this is not universally true;
* Some of the largest consumers of web analytics products are starting to struggle;
* Despite the conventional wisdom that dictates that brilliant analysts are safe when times are tough, I am getting more and more calls from brilliant analysts who are being laid off or being offered severance packages to walk away.

It is this last point, coupled with something I learned at Emetrics, that has me the most concerned. In D.C. at Emetrics I heard Liz Miller from the CMO Council say that most CMOs are a few years away from fully understanding the value of web analytics. If Liz is right, and her credentials are impeccable when it comes to the CMO’s office, then given the anecdotal evidence that continues to come in I wonder if web analytics is slightly more discretionary than we’d like to believe.

Don’t get me wrong, I sincerely hope to be wrong in this assessment. As an author, public speaker, evangelist, consultant, and conference co-producer focusing on web analytics I honestly hope to be able to write a follow-up post in six months saying, “Wow, I was really super-wrong about where the web analytics industry was going …”

But what can you do if I’m more right than not? What if you work in an affected sector, or work for a company known for its web analytics acumen that is suddenly faced with bankruptcy or worse? What if the folks you work for who profess a great love for data-driven decision making are really HiPPOs at heart and, when the real bloodletting begins, are just as likely to look for savings in areas that can be easily cut (human resources, for example) as opposed to those that would require breaking contracts?

What indeed.

If you’re in any way concerned about the current economy and your personal employment situation, here are five tips that I would offer to help you best prepare for the worst.

Tip #1: Focus on Increasing Profits, Not Minimizing Spend

My friend W. David Rhee just published a great response about the relationship between web analytics, sales, and marketing in a down economy. To paraphrase Dave, if the bosses begin to panic, you don’t want to be in a situation where you appear to be an expendable marketing cost that can be cut. It is far better to be focusing your analytical efforts on how the organization can be increasing profits, even if you have to fight to spend more time conducting analysis and less time generating reports.

Essentially you want to take Mr. James’ statement above to heart and work your butt off to optimize the lower levels of your conversion funnel, working with what you already have, not what you might be able to attract. The good news is that the technology supports this analysis; the bad news is that, more often than not, the deeper you get in the funnel, the more difficult optimization becomes, for a variety of reasons including the business, IT, and “the way we’ve always done it!”

Be a profit center, be big picture, become truly invaluable.

Tip #2: Don’t Be a Report Monkey

The unfortunate reality about web analytics work is that far too many smart people spend far too much time generating far too many reports that far too few people actually read and even fewer actually derive real value from. Sound familiar? When I started the conversation about process in web analytics in 2006 at Emetrics, over 80% of the audience said they spent too much time on “reports” and not nearly enough on “analysis” … sadly, I’m not confident that things have changed much in the past two years, especially on a percent-of-practitioners basis.

While there are any number of great posts about why reporting is overrated and how the real value in web analytics comes from careful, business-focused analysis of the data, there are still too few companies that have put the hub-and-spoke model into practice and are able to effectively leverage web analytical resources.

My advice is to step up and find the real value in your data, even if you have to conduct the analysis on your own in the wee hours. It’s not as if you can just stop generating reports (tempting as that may sound), but if you’re a good analyst, taking the time to figure out where the real opportunities to increase revenue are is the work you want to be doing anyway. Taking the initiative to make data-powered recommendations and present them is a good way to demonstrate your skills and commitment to the business (but don’t stop doing the job you’re being paid to do!)

Analysts conduct analysis and make recommendations. Be an analyst.

Tip #3: Start Watching the Job Boards

Even if you feel pretty good about the situation you’re in, you have to admit that the most accurate term to describe the current economy is “dynamic.” In situations like this, the worst thing you can do is be caught off guard, so I would suggest that spending a little time surfing the Web Analytics Demystified Web Analytics Job Board (also see the WAA’s version) would be time well spent.

According to the nice folks at SimplyHired the number of job postings looking for “web analytics” experience of some kind continues to increase.

Friday, November 14, 2008

How to Achieve Success with Web Analytics

Challenges of Current Web Analytics

The successful implementation and execution of web analytics programs is challenging, especially in larger organizations. Web analytics success depends strongly on clearly defined business goals, and because many organizations lack a web strategy with such goals, web analytics programs often suffer. This is an ongoing issue at the business level.

Common issues within organizations with respect to web analytics:

* Management doesn’t want people to spend time on web analytics
* Management tends to think: purchase, install and read reports (aka analytics is magic)
* The discipline is rarely a dedicated operation — it’s a part-time gig in many orgs
* Many practitioners are still struggling to make the business case.

It’s bullshit, but true.

Concrete vs. Magic with Analytics

The perception that web analytics work by magic is prolific and problematic. Combating this need not be difficult, but some internal education is usually necessary. The following list of tasks is a solid way to start a web analysis project:

1. Collect business requirements
2. Define metrics and methods of collection
3. Find or make data available (e.g., coding tags, systems integration, etc.)
4. Calculate metrics — develop models
5. Build reports and conduct analysis
6. Educate stakeholders as to how to use the resulting analysis

Wise Use of Web Traffic Reports
Traffic Data Needs Context

Traffic reporting is not always useful, nor is it always wise to distribute such reports. When used well, traffic reporting provides snapshot views of important website activity. These reports serve to answer the question “how is the site doing” at a glance.

But traffic data has a context and that context is the previously defined web business goals. So for such reports to be useful one needs to provide as much context as possible: What are the goals? Which higher numbers are better? Which lower numbers are better? What does the vocabulary mean? Etc.
Big Numbers are Not Always Good

High traffic numbers are not necessarily a sign of success. If you’re running a portal and have 1.25 pages per visitor session, this could be a sign of success — users might be finding exactly what they need, quickly. But if you’re running a publishing site, then seeing 4 page views per visit would be a tremendous success — users are discovering interesting content and continuing to read and interact with the website.
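The metric behind both examples is the same simple ratio; only its interpretation differs by site type. A minimal sketch (the traffic numbers below are invented):

```python
def pages_per_visit(page_views, visits):
    """Average page views per visitor session; guards against zero visits."""
    return page_views / visits if visits else 0.0

# A portal whose users find what they need quickly:
print(pages_per_visit(1250, 1000))  # 1.25
# A publishing site whose readers keep clicking through:
print(pages_per_visit(4000, 1000))  # 4.0
```

The number alone says nothing; whether 1.25 is a win or a failure depends entirely on the business goal behind the site.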

Context is essential. Distributing traffic data without supporting contextual information can be worse than meaningless: it can distract from core business goals. Many traffic reports require deeper analysis to understand their implications.
Traffic Reporting is Not Analysis

For web analysis professionals, it’s important to make clear (in the minds of report consumers) the difference between web traffic reporting and web analytics reporting. It’s also important to have good reasons, e.g., business goals, for distributing traffic reports.

Web analytics analysis typically goes far beyond traffic reporting and provides answers only available by drilling down into traffic reports and integrating more sophisticated business data. For example, with a “Top 10 Search Terms (internal)” report, one very often wants to know what happened after the search was executed — …was the content found? …was there an exit event? …what happened next?
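One way to sketch that drill-down, assuming hypothetical session records that pair each internal search term with the visitor’s next action:

```python
from collections import Counter

# Hypothetical post-search events; a real engine would derive these
# from visitor path or event data.
sessions = [
    {"term": "pricing", "next_action": "viewed_page"},
    {"term": "pricing", "next_action": "exit"},
    {"term": "returns", "next_action": "exit"},
]

def post_search_outcomes(sessions):
    """Tally what visitors did immediately after searching each term."""
    outcomes = {}
    for s in sessions:
        outcomes.setdefault(s["term"], Counter())[s["next_action"]] += 1
    return outcomes

print(post_search_outcomes(sessions)["returns"])  # Counter({'exit': 1})
```

A term whose dominant next action is “exit” is a strong hint that the search results for it are failing visitors.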
Analytics Dashboards Are Overrated

Following from the discussion of web traffic reporting, Phil took a few swings at the use of web analytics dashboards. Like traffic reports, the point was made that dashboards often fail to be useful in terms of making business decisions. Dashboards tend to suffer from the following problems:

* They don’t often contain a proper explanation of use
* Context tends not to be explained well
* Reports get “thrown over the wall” without discussion or follow-up
* Baselines and goals may not be present
* They often are more about traffic than business activities

So what’s the answer? Identifying the stakeholders, understanding both their business interests and the way they consume information, and then delivering specialized reporting and/or data that fits them. One-size-fits-all reporting can be self-defeating.

Doing Better Web Analytics

By focusing on business goals, patterns and needs for information consumption, and deeper analysis, web analytics projects can deliver enormous value. Phil provided some guidance for how to succeed.
7 Business Questions Analytics Can Address

Keeping a focus on business questions is the foundation of a successful web analytics process. Some examples of these are:

1. What gets funded?
2. Which content or functional areas should get staff and funding?
3. What projects to prioritize?
4. When is it time to redesign?
5. When to change content?
6. What navigation elements are working?
7. What search terms to purchase?

6 Questions for Web Analysts

When web analysts strive to refine their practice, Phil proposes that they ask these questions of themselves:

1. Who should be seeing raw data versus the resulting analysis?
2. When should we be analyzing versus reporting versus performing calculations on data?
3. How should we be presenting data and analysis to the different stakeholders?
4. What data do we need to analyze?
5. Why are we analyzing this data?
6. Are we analyzing the key financial metrics (money talks at all levels)?

Explorative Web Analysis is a Must Have

Explorative web analysis is the process of understanding how people interact with a website or application. Our presenter today asserts that this level of analysis is key to the business value of web analytics. This form of analysis, according to Phil, involves drilling down on data sets to a level of detail that goes beyond standard traffic reports.

He states that typical analytics tools — such as Omniture, WebTrends, Unica, Nedstat and Google Analytics — may be adequate for this, but there are also often cases where one will need to access and manipulate raw data. This type of analysis involves activities like:

* Answering many of the typical questions raised by traffic reports
* Segmenting visitor activity to better understand the performance of content and the results of marketing campaigns
* Painting richer pictures of how users are interacting with applications, content and navigation

Typical Explorative Analysis Exercises
General - All Types of Sites

Some broadly applicable explorative analysis operations include:

* Homepage analysis (e.g., content placement optimization)
* Internal search analysis (e.g., assess search usage and usability, identify ways to monetize search)
* Funnel and workflow analysis (e.g., identify fall out points and recommend process changes)
* Landing page analysis (e.g., analyze the effectiveness of entry points in moving visitors to and through key action funnels)
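A minimal version of the funnel analysis mentioned above might look like this sketch (the step names and visit counts are invented):

```python
# Visitors observed at each step of a hypothetical checkout funnel.
funnel = [("product page", 10000), ("cart", 2400),
          ("checkout", 900), ("purchase", 600)]

def fall_out(funnel):
    """Percent of visitors lost between each pair of consecutive steps."""
    report = []
    for (step_a, n_a), (step_b, n_b) in zip(funnel, funnel[1:]):
        report.append((step_a, step_b, round((1 - n_b / n_a) * 100, 1)))
    return report

for a, b, lost in fall_out(funnel):
    print(f"{a} -> {b}: {lost}% fall out")
```

The step with the steepest fall-out is where a process-change recommendation is most likely to pay off.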

For Publishers

Online publishers have specialized needs. They are focused on driving visitors into specific content hotspots and optimizing the monetization of their content. Phil highlighted a few exploration examples for publishing contexts:

* Ad real-estate analysis — identifying the best locations for ads to be placed on a page
* Functional analysis — categorizing key site areas by function and creating baseline measurements of each functional area
* SEO analysis — examine site entrances and site path behaviors segmented by key phrases.

Thursday, November 13, 2008

Search Engine Marketing

Search engine marketing is different from other internet marketing techniques. It is a passive form of advertising, requiring little outreach effort on your part. By choosing the right keywords and writing persuasive copy, you can effectively bring to your site potential customers who already want to buy.

The advantage of search engine marketing is that it allows consumers to find your business themselves. You don't have to pay for marketing to people who are not interested, and you don't risk getting a reputation for aggressive marketing techniques. Customers trust companies that they find on their own.

Search engines have gotten more sophisticated and continue to improve. Marketing through search engines is therefore evolving as well. Many of the improvements are rooted in new research about consumer behavior. Search engine marketing campaigns are increasingly targeted and effective as the technology and the techniques advance.

A successful web site will continue to draw customers after the first visit. If the content is engaging, traffic will increase through repeat visitors and word of mouth advertising. As traffic increases to your website, your organic search engine listing improves. As more people find your site through direct links or the unpaid search engine listings, your pay per click budget will decrease.

Most internet consumers turn to search engines to find or compare products they want. Create a strong search engine marketing campaign to tap into that market of buyers.

Even with a small budget and limited time, your business can find new customers through search engine marketing.

Market Research

Internet marketing is all about targeting your customers for the greatest return on your advertising investment. To target your market, you have to understand how consumers choose the products they buy.

Market research gives you the information you need to effectively appeal to your customers. It gauges consumer interest in your product and helps you identify where and how to advertise your product. Effective market research relies on collecting and analyzing reliable data about consumer habits and opinions.

You can perform your own market research by surveying your existing customers or inviting potential customers to participate in evaluating your products. By communicating with your customers, you can improve your products and service while building customer loyalty.

You can also install software that will track how users view and use your web site. Understanding the behavior of your web site visitors helps you advertise more effectively.

There are several types of market research that focus on the product itself. Demand estimation tells you if your product is sellable and how much competition you face.

Commercial eye tracking research determines how your product appeals to consumers. Test marketing lets customers try out your product and make recommendations for improvement.


Other market research methods focus on the consumer's perception of your company. Brand name testing, usually through surveys, evaluates your name recognition and your appeal with potential customers. Ad tracking, the most quantitative research method, evaluates whether viewers respond to your marketing techniques by following through or abandoning the sale.

Before investing in your internet marketing campaign, take time to really learn your market. Know what your customers are looking for and what they like. Targeting your customers through market research will prevent the waste of precious advertising resources.

Online Marketing - Decreasing Costs

The goal of online marketing is to have a high return on your investment. In other words, you want to make more money than you spend. An effective online marketing plan will help you do just that. Your marketing plan identifies your customers, your message, and your delivery.

The key to being successful in online marketing is to know your customers. Know what they want, where they are, and how to reach them. Offer them products or services that fit their lives and their consumer philosophy. For example, selling spangled thong bikinis to grandmothers over 80 through email written in an 8 point font is not likely to get a lot of response, or, at least, not a positive one.

Understand what products appeal to your chosen market, or conversely, which market wants your chosen product. Online marketing is not effective if you do not correctly identify your customers and create your message specifically for them. You may find that your customers fall into several distinct groups. In that case, create several different marketing packages, each targeted towards a specific group.

Make it easy for potential customers to find you. Use search engine optimization techniques to update your web site content and increase your search engine ranking. Implement a blogging for business plan to increase your expertise and the number of links to your page. Research affiliate marketing programs to see if one can get you higher exposure to your target customers.

The number of people buying products or services through the web is steadily increasing. An effective online marketing plan will help you tap into that growing market while keeping your advertising costs low.

Who are Your Website Visitors and What Are They Doing On Your Site?

One of the keys to marketing your website effectively is knowing who's going there and what they are doing.

When armed with this information, you are able to market your site more effectively. You can also see which pages are getting hits and which pages are not.

Simple Site Analytics

First and foremost you will want to check with your web hosting service to see if they offer site analytics with their hosting packages.

Many web hosting providers offer this with your hosting package, and all you have to do is set up your password.
These are generally very basic programs that let you know how many people visited your site on a daily, weekly, and monthly basis. They can also provide you with hit counts for individual pages and the types of browsers that were being used.

Google also offers an excellent free site analytics program, Google Analytics. It is very useful because it is web based and will even show you what parts of the country your visitors are located in.

It can also provide you with information on visits by source. For example, 33% of your traffic may be coming from Yahoo!, another 33% from a referral site, 16% directly to your site, and so on. This information is useful in helping you determine where your marketing is really working.

The Information You Receive

We'll use Google Analytics as an example here. This program provides you with a visitor summary in four charts, giving you a quick snapshot of the visits to your site. You see the total number of visits and page views you have received, as well as the number of first-time and returning visits. You can also see which cities your visitors came from.

The marketing summary is also very useful. Here you will find your top five sources of traffic, the top five keywords that have been bringing you traffic, and your top five marketing campaigns. These will tell you which areas are working well for you and which have seen a decline; paired with your sales or sign-up numbers, they also let you work out conversion rates.
The marketing summary will also tell you the difference between organic, referral, "not set," and direct hits on your website. These are very useful tools when determining which of your marketing efforts have been working best for you. Many programs will provide you with very similar information.

You may also consider using more than one website analytics program so that you can get an even better view of whether or not your marketing campaigns have been working, and where. Some programs may include better features than others, and you can combine the information from them all to get a good overall view of your website.
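As a rough illustration, the visits-by-source breakdown described above can be computed from a simple log of visit sources. The data here is made up; a real analytics program collects these sources from referrer information automatically.

```python
# A rough sketch of a "visits by source" breakdown like the one an
# analytics dashboard reports. The visit log below is sample data.
from collections import Counter

visits = ["yahoo", "yahoo", "referral", "referral", "direct", "yahoo"]

counts = Counter(visits)      # number of visits per source
total = sum(counts.values())  # total visits
for source, n in counts.most_common():
    print(f"{source}: {n / total:.0%} of traffic")
```

Run against a real referrer log, this gives the same kind of organic/referral/direct split that an analytics dashboard charts for you.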

Action Steps For Using Site Analytics

1. Determine the type of information you would like to receive from a website analytics program.

2. Begin searching for products that will fulfill your requirements.

3. Compare your marketing efforts to your traffic and page views.

Important Points About Site Analytics

* Don't forget to set your free site analytics up through your hosting service if it is available.
* Google offers many tools that can give you a different view of your traffic.
* Combine multiple services for the most information.

What Will Meta Tags and Robots.txt Files Do For Your Website?

As the web grows and the search engines continuously get smarter, we need to learn as many tricks as possible to get our sites indexed. Meta tags and the robots.txt file are two tools that let you pick and choose which pages are and are not indexed by the search engines.

Robots.txt File Versus Meta Tags

A meta tag is a tool that we use to choose which pages we want the search engines to index.

Many web developers will use the meta tag to also tell the search engines which pages they do not want to have indexed.

This code generally looks like this:
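The standard form is the robots meta tag, placed in the <head> section of the page you want to control (the exact values can be combined as needed):

```html
<!-- Tell all search engine robots not to index this page
     and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

To allow indexing but discourage link-following, you would use content="index, nofollow" instead.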

Some search engines do not use this tag and will completely ignore it. This is where the robots.txt file comes in handy. In this file you are able to list specific pages that you do not want indexed. These pages may include password-protected folders, pages, or images, for example.

How Do I Make a Robots.txt File?

Begin by creating a simple Notepad text file and name it robots.txt (the file must use this exact name, or the search engines will not find it). There are two directives that you will use in the file to tell the search engines what you do not want them to index.

These are:

  • User-agent: * — The asterisk (*) is a very important part of this code. By using the asterisk you are addressing all of the robots that are trying to index your site, which is the easiest way to make sure your rules apply everywhere. You can target individual robots by their specific names, but in general you will want to stick with this form.
  • Disallow: / — After the disallow statement you will list any folders that you do not want the robots to index. If you list only the slash (/) with no folder name after it, that line tells the robots not to index any part of the website, which you do not necessarily want. Any file or folder listed after Disallow: will not be indexed.
Robots.txt File Example

Here is an example of how you would use both of these commands:

User-agent: *

Disallow: /tutorials/meta/
Disallow: /documents/images/
Disallow: /pages/404redirect/

In this example the asterisk is a sort of wild card. It addresses all of the search engine robots that would be indexing your site. Using the disallow command we are telling the search engine robots to not index the meta files, images, and 404 redirect pages in our site.


You can substitute any folders or files that you do not want indexed. It is important to note here that tutorials, documents, and pages are all top-level folders, while meta, images, and 404redirect are the subfolders or pages inside them. If you do not use nested folders when developing your site, you can leave out the outer folder name and list the path directly, such as:
Disallow: /images/

This would direct the search engine robots not to index anything in that folder. On the other hand, if you have folders within folders and want to block a single file inside the inner folders, your command may look like:

Disallow: /documents/images/picture1.jpg

This file is very simple to use; upload it to the root directory of your website along with your regular website files.
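Before uploading, you can double-check how your rules will be read. Here is a minimal sketch using Python's built-in urllib.robotparser module, fed the example rules from this article; it interprets robots.txt the same way a well-behaved crawler does.

```python
# Check how a well-behaved crawler interprets robots.txt rules,
# using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /tutorials/meta/
Disallow: /documents/images/
Disallow: /pages/404redirect/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Blocked: the path falls under a Disallow rule
print(parser.can_fetch("*", "/documents/images/picture1.jpg"))  # False
# Allowed: no rule matches this path
print(parser.can_fetch("*", "/index.html"))  # True
```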

You can see more robots.txt examples by visiting /robots.txt on almost any major website.

Action Steps For A Robots.txt File

1. Determine if there are any pages that you do not want the search engine to index.

2. Create the file in Notepad using these simple commands.

3. Upload the file to your website's file directory through your hosting company.

Important Points About Robots.txt Files

  • These pages will not be indexed. Therefore, any information or keywords on these pages would be irrelevant.
  • Many people use this file to keep pages out of search results, but note that robots.txt does not actually protect information: the file itself is publicly readable, and only well-behaved robots obey it.
  • Be sure to use the asterisk to ensure that you are addressing all search engines.

Use Search Engine Listing Services to Create a Bigger Web Presence for Your Site

When we think about search engines we usually think about the big guys. You know, Google, Yahoo! and MSN. These search engines are great, however, they are not the only ones that are used. There are actually several hundred search engines and you may be missing out if you are not getting yourself listed in as many as possible.

Search Engine Listing Services

There are numerous free search engine listing services available that will submit your site to about ten to fifteen different search engines.

This is great, but the problem here is that it can take a lot of time for the search engines to crawl around to your site, especially if it's not generating a whole lot of traffic.

So, what you may have to do is spend a little money. There are several search engine listing services that will submit your website to these search engines on a monthly basis in order to keep your site on their radar. Sites that appear popular tend to get the most attention.

You can submit your site to the search engines yourself by using their submit form, but how many search engines do you know of? You might be able to think of six or so search engines, but there are actually hundreds of them out there.

Sure, Google is one of the most popular, along with Yahoo! and the other big guys, but what about the search engines in different countries and others, such as AltaVista, that don't get used as often?

Some Internet users still prefer these search engines over the others as you can sometimes find more interesting sites that the big guys weren't interested in. These are the sites you are going to be targeting when you use a search engine listing service.

Getting Listed Doesn't Mean You Will Get a High Ranking


When you are first listed, you will not automatically receive a high rank for your particular keywords. Being listed simply means that the search engines know you exist, which is the first step in the entire process. Search engine listing services provide an easy-to-use tool that will get you listed and noticed initially, but you have to keep working to move yourself up the list.

You Get What You Pay For

You can use the free services and this is good to get you started, but you are almost always going to see more traffic from the paid submission services. These services also speed up the process of getting you submitted because you are a paying customer.

This means that you are going to need to set up a budget. This will depend on how fast you want to get listed. If you have a small budget, then you will find that Yahoo!'s human-compiled directory is one of the best to ensure that the major search engines pick you up. Paid placement programs are also an option. You are often able to get listed quickly, if you are willing to pay a monthly fee.

Essentially you are setting up a paid advertising program such as Google AdWords, and as your site gets more hits from the advertisements, you can end up in the natural listings. After a while, with enough traffic, you may be able to drop your paid advertising and get noticed on the strength of the traffic to your site.

Important Points About Search Engine Listing Services

  • Just because you submit your site doesn't mean you are going to end up near the Top 10.
  • You will need to continually add and change the content on your site to keep the search engines coming back to you.
  • You may have to budget a good sum of money in order to be placed more quickly.
