profitable-website: November 2006


Wednesday, November 01, 2006

Making your website attractive, interesting and engaging

Introduction
To succeed at your online business (whether you are selling your own product/service or are selling for other merchants as an affiliate), you need a Web site created just for that: a simple, focused site. One that is easy to build, maintenance-free, low cost, credible, and a powerful traffic-builder and customer-converter.
Having the right tool and the right product alone doesn't ensure the success of your website. There are many factors to be considered while designing a site. Unfortunately, most of these are easily ignored by Internet business owners.

Build It for Speed
It's a fact of modern life - people are in a hurry. This means that you have between 10 and 30 seconds to capture your potential customer's attention. To minimize your load time, keep graphics small. Compress them where possible. Use flashy technology (JavaScript, Flash, Streaming Audio/Video, animation) sparingly and only if it is important to your presentation.
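That attention budget translates into a rough page-weight limit. A back-of-the-envelope sketch (the connection speed and file sizes here are illustrative assumptions, not figures from the article):

```python
# Rough page-load estimate: total bytes / effective connection speed.
# Assumption for illustration: a dial-up modem delivers roughly
# 5 KB/s in practice after protocol overhead.

def load_time_seconds(asset_sizes_kb, speed_kb_per_s=5.0):
    """Estimate seconds to download all of a page's assets."""
    return sum(asset_sizes_kb) / speed_kb_per_s

# A page with 20 KB of HTML and three 30 KB images:
page = [20, 30, 30, 30]
print(f"{load_time_seconds(page):.0f} seconds")  # 110 KB / 5 KB/s = 22 seconds
```

At 22 seconds, this hypothetical page just squeaks inside the 10-to-30-second window; one more uncompressed image would push it out.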

Target your Market
Know who your market is and make certain that your site caters to their needs. It is critical that your site reflect the values of your potential customers. Is your market mostly business professionals? If so, the site must be clean and professional. Is your product aimed mostly at teenagers and young adults? Then your site could be more informal and relaxed. The key here is to know your market and build the site to their preferences.

Focus the Site
Make certain your web site is focused on the goal: selling your product or service. A site offering many unrelated products is not necessarily unfocused, but this is often the case. If your business does offer many products, dedicate a unique page to each instead of trying to sell them all from one page.

Credibility Is Crucial
The most professionally designed site won't sell if your customers don't believe in you. A clear privacy statement is one way to build your credibility. Provide a prominent link to your privacy statement from every page on the site, as well as from any location where you ask your visitors for personal information. Provide legitimate contact information online.

Navigation should be Simple
Make site navigation easy and intuitive. Simple and smooth navigation adds to the convenience of your visitors. Add powerful search and catalog features; many visitors do not have the patience to navigate through the whole website to find what they are looking for.

Consistency is the key
Make sure the site is consistent in look, feel and design. Nothing is more jarring and disturbing to a customer than feeling as if they have just gone to another site. Keep colors and themes constant throughout the site.

Make your site interactive and personalized
Make your website interactive. Add feedback forms as well as email forms that allow your prospective customers to ask you any questions they might have pertaining to a product. Personalization of your website is another key element that can lead to customer delight and can increase your sales. Personalization technology provides you the analytic tools to facilitate cross-selling and up-selling when the customer is buying online. It gives you an idea of what products to cross-sell and up-sell. For example, when a person buys a CD player, a disc cleaner can also be offered.
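The cross-sell idea can be sketched with a hand-built lookup table. The product names and the mapping below are invented for illustration; a real personalization engine mines these associations from purchase history rather than hard-coding them:

```python
# Hypothetical cross-sell table: each product maps to accessories
# commonly bought with it (entries invented for illustration).
RELATED = {
    "cd player": ["disc cleaner", "headphones"],
    "camera": ["film", "camera bag"],
}

def cross_sell_offers(cart):
    """Suggest related items for everything in the cart, skipping
    anything the customer is already buying."""
    offers = []
    for item in cart:
        for suggestion in RELATED.get(item, []):
            if suggestion not in cart and suggestion not in offers:
                offers.append(suggestion)
    return offers

print(cross_sell_offers(["cd player"]))  # ['disc cleaner', 'headphones']
```

The same table can drive up-selling by mapping a product to a pricier model of itself instead of to accessories.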

Content is King
Good content sells a product. Ask yourself the following questions. Does your copy convey the message you wish to get across to your visitors? Is it compelling? Does it lead your visitor through the sales process? Have others review, critique and edit your copy to ensure it is delivering the intended message. Always double-check your spelling and grammar.

These eight simple rules will go a long way toward the improvement of your website and, most importantly, turning visitors into customers.

Top Search Engines 2

Lycos
Lycos is one of the oldest search engines on the Internet today, next to AltaVista and Yahoo. Their spider, named "T-Rex", crawls the web and provides updates to the Lycos index from time to time. The FAST crawler provides results for Lycos in addition to its own database.
The Lycos crawler does not weigh META tags too heavily; instead it relies on its own ranking algorithm to rank pages returned in results. The URL, META title, text headings, and word frequency are just a few of the methods Lycos uses to rank pages. Lycos does support pages with frame content. However, any page with fewer than 75 words of content is not indexed.

Excite
Excite has been around the web for many years now. Much more of a portal than simply a search engine, Excite used to be a fairly popular search engine, until companies such as Google came to dominate the search engine market. Recently, Excite stopped accepting submissions of URLs, and appears to no longer spider. To get into the Excite search results, you need to be listed with either Overture or Inktomi.

Looksmart
Getting listed with Looksmart could mean getting a good amount of traffic to your site. Looksmart's results appear in many search engines, including AltaVista, MSN, CNN, and many others.
Looksmart has two options for submitting your site. If your site is generally non-business related, you can submit your site to Zeal (Looksmart's sister site), or if you are a business, you can pay a fee to have your site listed. Either method will get you listed in Looksmart and its partner sites if you are approved.
Once you have submitted your site and it is approved for listing, it will take up to about 7 days for your site to be listed on Looksmart and its partner sites.

AOL Search
America Online signed a multiyear pact with Google for Web search results and accompanying ad-sponsored links, ending relationships with pay-for-performance service Overture Services and Inktomi, its algorithmic search provider of nearly three years.

Take some time to register with these search engines as soon as possible and watch the traffic grow.

Top Search Engines 1

Google
Google has increased in popularity tenfold over the past several years. They have gone from beta testing to becoming the Internet's largest index of web pages in a very short time. Their spider, affectionately named "Googlebot", crawls the web and provides updates to Google's index about once a month.
Google.com began as an academic search engine. Google, by far, has a very good algorithm for ranking the pages returned from a search, probably one of the main reasons it has become so popular over the years. Google uses several methods to determine page rank in returned searches.

Yahoo
Yahoo! is one of the oldest web directories and portals on the Internet today; the site went live in August of 1994. Yahoo! is a 100% human-edited directory, and provides secondary search results using Google.
Yahoo! is also one of the largest traffic generators around, as far as web directories and search engines go. Unfortunately, however, it is also one of the most difficult to get listed in, unless of course you pay to submit your site. Even paying doesn't guarantee you will get listed.
Either way, if you suggest a URL, it is "reviewed" by a Yahoo! editor, and if approved it will appear in the next index update.

AltaVista
Many who have access to web logs may have seen a spider named "Scooter" accessing their pages. Scooter used to be AltaVista's robot. However, since the February 2001 site update, a newer form of Scooter has been crawling the web. Whichever spider AltaVista uses, it is one of the largest search engines on the net today, next to Google.
It will usually take several months for AltaVista to index your entire site, although over the past few months Scooter hasn't been deep-crawling very well. Unlike Google, AltaVista will only crawl and index one link deep, so it takes a good amount of time to index your site, depending on how large your site is.
AltaVista gets most of its results from its own index; however, they do pull the top 5 results of each search from Overture (formerly GoTo).

Inktomi
Inktomi's popularity grew several years ago when they powered the secondary search database that drove Yahoo. Since then, Yahoo has switched to using Google as their secondary search and backend database; however, Inktomi is just as popular now as it was several years ago, if not more so. Their spiders are named "Slurp", and different versions of Slurp crawl the web many times throughout the month, as Inktomi powers many sites' search results. There isn't much more to Inktomi than that. Slurp puts heavy weight on title and description tags, and will rarely deep-crawl a site. Slurp usually only spiders pages that are submitted to its index.
Inktomi provides results to a number of sites. Some of these are America Online, MSN, HotBot, Looksmart, About, GoTo, CNet, Geocities, NBCi, ICQ and many more.

Secrets of Winning Traffic through Search Engines

How Search Engines work

Internet search engines are special sites on the Web that are designed to help people find information stored on other sites. There are differences in the ways various search engines work, but they all perform three basic tasks:
* They search the Internet -- or select pieces of the Internet -- based on important words.
* They keep an index of the words they find, and where they find them.
* They allow users to look for words or combinations of words found in that index.
Early search engines held an index of a few hundred thousand pages and documents, and received maybe one or two thousand inquiries each day. Today, a top search engine will index hundreds of millions of pages, and respond to tens of millions of queries per day.

Spidering
Before a search engine can tell you where a file or document is, it must be found. To find information on the hundreds of millions of Web pages that exist, a search engine employs special software robots, called spiders, to build lists of the words found on Web sites.
When a spider is building its lists, the process is called Web crawling.

In order to build and maintain a useful list of words, a search engine's spiders have to look at a lot of pages. How does any spider start its travels over the Web? The usual starting points are lists of heavily used servers and very popular pages. The spider will begin with a popular site, indexing the words on its pages and following every link found within the site. In this way, the spidering system quickly begins to travel, spreading out across the most widely used portions of the Web.
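The crawl described above can be sketched in a few lines. The pages below form a toy in-memory "web" (the page names, words and links are invented); a real spider would fetch URLs over HTTP and parse the HTML for links:

```python
from collections import deque

# Toy web: page -> (words on the page, links found on the page).
WEB = {
    "popular-site": (["search", "engine", "news"], ["page-a", "page-b"]),
    "page-a": (["search", "tips"], ["page-c"]),
    "page-b": (["engine", "reviews"], []),
    "page-c": (["news", "archive"], ["popular-site"]),  # link cycle back
}

def crawl(start):
    """Breadth-first crawl from a popular starting page, building a
    list of the words seen on each page and following every link."""
    word_lists = {}
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        words, links = WEB[page]
        word_lists[page] = words
        for link in links:
            if link not in seen:  # never revisit a page (avoids cycles)
                seen.add(link)
                queue.append(link)
    return word_lists

print(sorted(crawl("popular-site")))
# ['page-a', 'page-b', 'page-c', 'popular-site']
```

Starting from the "heavily used" page and fanning out link by link is exactly how the spidering system spreads across the most popular portions of the Web first.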

Indexing
Once the spiders have completed the task of finding information on Web pages, the search engine must store the information in a way that makes it useful. There are two key components involved in making the gathered data accessible to users:
* The information stored with the data
* The method by which the information is indexed
In the simplest case, a search engine could just store the word and the URL where it was found. In reality, this would make for an engine of limited use, since there would be no way of telling whether the word was used in an important or a trivial way on the page, whether the word was used once or many times, or whether the page contained links to other pages containing the word. In other words, there would be no way of building the ranking list that tries to present the most useful pages at the top of the list of search results.
To make for more useful results, most search engines store more than just the word and URL. An engine might store the number of times that the word appears on a page. The engine might assign a weight to each entry, with increasing values assigned to words as they appear near the top of the document, in sub-headings, in links, in the meta tags or in the title of the page. Each commercial search engine has a different formula for assigning weight to the words in its index. This is one of the reasons that a search for the same word on different search engines will produce different lists, with the pages presented in different orders.
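A sketch of such a weighted index over two hand-made pages. The weights (3 for a title word, 1 for a body word) are an arbitrary choice for illustration; as the paragraph above says, every engine uses its own formula:

```python
# Weighted inverted index: word -> {url: accumulated weight}.
# Title words count more than body words (weights are illustrative).
TITLE_WEIGHT, BODY_WEIGHT = 3, 1

def build_index(pages):
    """Build word -> {url: weight} from {url: (title, body)} pages."""
    index = {}
    for url, (title, body) in pages.items():
        weighted_words = (
            [(w, TITLE_WEIGHT) for w in title.lower().split()]
            + [(w, BODY_WEIGHT) for w in body.lower().split()]
        )
        for word, weight in weighted_words:
            index.setdefault(word, {}).setdefault(url, 0)
            index[word][url] += weight
    return index

pages = {
    "a.html": ("Fishing Tips", "tips for fly fishing"),
    "b.html": ("Boat Sales", "fishing boats for sale"),
}
index = build_index(pages)
print(index["fishing"])  # {'a.html': 4, 'b.html': 1}
```

Because "fishing" appears in a.html's title and body but only in b.html's body, a.html accumulates the larger weight and would rank first for that word.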
An index has a single purpose: it allows information to be found as quickly as possible. There are quite a few ways for an index to be built, but one of the most effective ways is to build a hash table. In hashing, a formula is applied to attach a numerical value to each word. The formula is designed to evenly distribute the entries across a predetermined number of divisions. This numerical distribution is different from the distribution of words across the alphabet, and that is the key to a hash table's effectiveness.
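A tiny illustration of hashing words into a fixed number of buckets. The bucket count and the hash formula are arbitrary choices made for this sketch (Python's built-in dict does the same job internally with its own formula):

```python
# Hash each word to one of N buckets so that entries spread evenly
# across the table, independent of alphabetical order.
N_BUCKETS = 8

def bucket_for(word):
    """Simple polynomial hash, reduced modulo the table size."""
    h = 0
    for ch in word:
        h = (h * 31 + ord(ch)) % N_BUCKETS
    return h

table = [[] for _ in range(N_BUCKETS)]
for word in ["apple", "ant", "anchor", "zebra", "zero"]:
    table[bucket_for(word)].append(word)

# Words that start with the same letter usually land in different
# buckets - the numerical spread ignores the alphabet entirely.
print([bucket for bucket in table if bucket])
```

Looking up a word then means hashing it and scanning only its one small bucket, rather than the whole word list - which is exactly why the index "allows information to be found as quickly as possible".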

The search engine software or program is the final part. When a person requests a search on a keyword or phrase, the search engine software searches the index for relevant information. The software then provides a report back to the searcher, with the most relevant web pages listed first.

Is your website search engine friendly? If you have any doubts, it may be time to take a look and make your own “big break”.
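That lookup step can be sketched end-to-end with a toy inverted index (the URLs and weights below are invented): the software pulls each query word's entry from the index, sums the weights per page, and sorts so the most relevant pages come first.

```python
# Toy weighted inverted index: word -> {url: weight} (invented figures).
INDEX = {
    "fishing": {"a.html": 4, "b.html": 1},
    "tips":    {"a.html": 4, "c.html": 2},
}

def search(query):
    """Score every page matching any query word; best page first."""
    scores = {}
    for word in query.lower().split():
        for url, weight in INDEX.get(word, {}).items():
            scores[url] = scores.get(url, 0) + weight
    return sorted(scores, key=scores.get, reverse=True)

print(search("fishing tips"))  # ['a.html', 'c.html', 'b.html']
```

Pages matching more query words, or matching them with higher weight, accumulate a larger score and rise to the top of the report, which is the behavior searchers see as "most relevant first".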