Cautious détente pervades relations in the Web’s “Great Game”
By Chris Sherman
Posted On December 6, 1999
Most Web searchers don't realize it, but there's a virtual cold war going on behind the scenes of the Internet. Reminiscent of Kipling's "Great Game," it's a rough-and-tumble struggle that pits Webmasters staking claims in cyberspace against the ostensibly objective Net cartographers attempting to create accurate and reliable guides of the Web.

Search Engine Strategies 99, held in San Francisco on November 18, was essentially a high-level summit for key players in this Great Game, bringing Web content creators, marketers, and promoters face-to-face with representatives of the major search engines and directories. Though they should be natural partners, content creators and the compilers of Web indexes and directories view each other with wary caution.

Ironically, the root cause of this antipathy lies with a third group of people, whose shadowy presence was felt but unseen at the conference. These are the unethical Web site operators who subvert the process of compiling reliable Web guides by "spamdexing" the search services, overwhelming them with millions of bogus pages in misguided and often futile attempts to draw attention to themselves.

That there's even a need for a conference on a topic as fundamentally straightforward as creating reliable indexes and directories dramatically illustrates the relative youth and chaotic nature of the Web. Though everyone at the conference was striving for similar goals, the lack of standards, consistent procedures, and even resources to keep up with the exploding growth of the Web made consensus among the players all but impossible.

Search Engine Strategies 99 was sponsored by Internet.com and moderated by Danny Sullivan, the respected editor of Search Engine Watch. The conference featured intensive "how-to" sessions for Web content creators, offering an impressive array of strategies and tactics for achieving prominent visibility in search engines and directories. The conference concluded with panels featuring representatives from the major Web directories and search engines, with brief presentations and some lively question-and-answer interchanges. Here are some of the key highlights of the conference.
 

Every Web Page Has Its Own Song
Danny Sullivan's keynote presentation was an introduction to search engines and directories, focusing on the issues involved in getting Web pages listed and favorably placed in search results. To illustrate the importance of crafting "search friendly" pages, Sullivan related an amusing anecdote about his Welsh father-in-law's rugby team. After matches, all team members go to the local pub, and after imbibing enough, each sings a song. Every man's song is unique, instantly recognizable as a personal trademark of sorts.

Just as "each man has a song, each page has its own song," said Sullivan. Web authors should isolate and focus on the unique characteristics of each individual page when optimizing them for search engines. This will make them stand out from others and, in theory, rise to the top of search results for their unique keywords.

Shari Thurow, Webmaster and marketing director for GrantasticDesigns.com, told the audience that she had a 100 percent success rate in achieving top-20 listings in search engines for her clients. She offered five rules of Web design for creating search-engine-friendly pages: Make pages easy to read, easy to navigate, easy to find, consistent in design and layout, and quick to download.

Noting that search engine optimization is becoming increasingly difficult as the Web grows larger, she urged the audience to consult with specialists. "Bring in a search engine specialist early in the game," to save the time and expense of having to undertake a costly redesign of a site, she said.
 

"Search Does Not Work On The Internet"
In the panel on "Meta Tags," participants discussed the most effective ways to use "keyword" and "description" metatags, which appear in the source code of a Web page but are not displayed by a Web browser. According to Sullivan, metatags should be used as a magnifying glass to help the search engine focus on the most important parts of a page.
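To make the discussion concrete, here is a minimal Python sketch, using the standard library's HTMLParser, that pulls the "keywords" and "description" metatags out of a page's source. The sample page and its tag contents are invented for illustration, not taken from any site discussed at the conference.

```python
from html.parser import HTMLParser

class MetaTagExtractor(HTMLParser):
    """Collects the content of <meta name="keywords"> and <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            if name in ("keywords", "description"):
                self.meta[name] = attrs.get("content", "")

# A hypothetical page source; the metatags live in the <head> and never render.
page = """
<html><head>
<title>Rugby Songs of Wales</title>
<meta name="keywords" content="rugby, songs, Wales, pub">
<meta name="description" content="A collection of traditional Welsh rugby songs.">
</head><body>Visible page text here.</body></html>
"""

parser = MetaTagExtractor()
parser.feed(page)
print(parser.meta["keywords"])
print(parser.meta["description"])
```

A spider does essentially the same extraction at scale, which is why stuffing these fields became such an easy avenue for abuse.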

But the use of metatags is mildly controversial. Stuffing metatags with spurious keywords and descriptions is a favorite tactic of spammers, and many search engines either ignore or downplay metatags in favor of other, potentially more reliable indicators of Web page content.

Jakob Nielsen, user advocate of the Nielsen Norman Group, blasted the quality and reliability of Internet search. "The first conclusion we have from all the usability studies is that search does not work on the Internet," he said. Nielsen blamed both the search engines and page designers for the sorry state of Web search.

"People are only going to click on things that they think solve their problem," he said. All too often the information returned in a search result is poorly written, or written for the wrong medium. "Writing headlines for the Internet is very different than writing headlines for newspapers because they will be taken out of context," he said.

He also blamed the search engines for including extraneous factors such as numeric relevance rankings in search results. "Every extra bit of information is pollution; get rid of it," he said.

Panelists voiced unanimous frustration with the continually changing algorithms search engines use to calculate relevance. "What worked two and a half months ago when the site got indexed doesn't work today," said D.R. Peck, CEO of Green Flash Systems. Search engine optimization "is a horrible job. There's no certainty at all. Being number one is like being in love on that first day—it's great, but it's not going to last," he said.

In addition to constantly changing algorithms, each search engine uses different criteria to determine relevance. "What you do to attain a top-10 ranking in one search engine can actually cause your ranking to decline in others," said Fredrick Marckini, CEO of ResponseDirect.com.

In a fascinating panel with the deceptively innocuous title "Doorway Pages," three experts discussed techniques designed to beat search engines at their own game, through methods that essentially amount to sophisticated bait-and-switch tactics.

Doorway pages and IP delivery are two such techniques. Doorway pages are short, tightly focused pages designed to achieve top ranking for specific keywords on specific search engines. A doorway page serves no purpose other than to achieve a high-ranking search result that entices a searcher to click through, often to be redirected to an entirely different site.

A high-tech variation on this strategy is called "IP delivery." This technique uses software to watch for the arrival of search engine indexing spiders and then dynamically feed them tailored pages for indexing (spiders can be identified by their unique IP—or Internet Protocol—addresses). Though the text that's retrieved and indexed by the search engine is designed to score highly, the actual text appearing on the page that the user ultimately sees can be entirely different. If you've ever clicked through on a search result where the page title and description don't match what you end up viewing, there's a good chance you've just experienced the effects of IP delivery.
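The core of IP delivery is a simple branch on the requester's identity. The Python sketch below shows the idea in its most reduced form; the spider addresses and page contents are invented placeholders, and real implementations maintain large, constantly updated lists of crawler IPs.

```python
# Hypothetical sketch of "IP delivery" (cloaking): serve a keyword-stuffed
# page to known search engine spiders and the real page to everyone else.
# The addresses below are invented placeholders, not real crawler IPs.
KNOWN_SPIDER_IPS = {"10.0.0.1", "10.0.0.2"}

OPTIMIZED_PAGE = "<html><title>best widgets cheap widgets</title>...</html>"
REAL_PAGE = "<html><title>Acme Home Page</title>...</html>"

def serve_page(client_ip: str) -> str:
    """Return the tailored page to spiders, the actual page to human visitors."""
    if client_ip in KNOWN_SPIDER_IPS:
        return OPTIMIZED_PAGE   # what the search engine retrieves and indexes
    return REAL_PAGE            # what the searcher ultimately sees

print(serve_page("10.0.0.1"))      # a spider's view of the site
print(serve_page("192.168.1.50"))  # a human visitor's view
```

Because the engine never sees what the visitor sees, the mismatch between search result and destination page is invisible to the index itself — which is exactly why the search services regard the technique as abuse.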
 

Humans Gaining on Spiders
After an intensive day of learning how to optimize pages and outwit search engines and directories, it was refreshing to attend panels featuring representatives from the major search services. A very clear trend emerged: Human-compiled directories are muscling aside spider-compiled search engine indexes. Almost all of the major search engines now offer results served from a directory, often downplaying the results from their own spidered indexes.

Clearly, the Open Directory Project (ODP) has become the most important directory in recent months. Though Yahoo! still draws more traffic than any other individual directory, ODP data is now being served by AOL Search, HotBot, Lycos, AltaVista, and Netscape, among others, giving it an impressive cumulative reach that nearly rivals Yahoo!'s. And the ODP, using volunteer editors (currently numbering more than 20,000), is better equipped to scale with the growth of the Web than Yahoo!, which employs a staff of only a few hundred.

ODP data is freely available to any search service that wishes to use it. In exchange, every ODP-served page has a link inviting users to become an editor, highly effective free marketing that contributes to the growth of the directory. According to Chris Tolles, senior marketing manager for the ODP: "Big sites contribute traffic, editors, and sites. Everybody who uses the directory has to contribute."

The ODP's growth statistics are impressive. With over 100 sites submitted per hour, the directory is growing at about 2 percent compounded per week. Tolles isn't modest about the importance of the directory he helped found and nurture: "We think the taxonomy is going to become a standard for classification on the Web."
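Tolles's figure compounds quickly. A back-of-the-envelope calculation (assuming, for illustration, that the 2 percent weekly rate holds steady over a 52-week year) shows what that growth rate implies:

```python
import math

WEEKLY_GROWTH = 0.02  # 2 percent compounded per week, per Tolles

# How many weeks until the directory doubles at that rate
doubling_weeks = math.log(2) / math.log(1 + WEEKLY_GROWTH)
print(f"Doubles every {doubling_weeks:.0f} weeks")

# Growth factor over a full 52-week year
yearly_factor = (1 + WEEKLY_GROWTH) ** 52
print(f"Grows {yearly_factor:.1f}x per year")
```

At that pace the directory doubles roughly every 35 weeks and nearly triples in a year — which is precisely why a volunteer corps that grows with the directory scales better than a fixed paid staff.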
 

Growing While Under Siege
Perhaps it was the nature of the audience, but the representatives of the major search engines featured on the final panel of the conference seemed decidedly defensive. While the marketing departments of these companies are spending millions of dollars to boast about the size of their indexes, the technologists on the panel were guarded, seemingly more concerned with efforts to protect the integrity of their search indexes from even a hint of spam.

"Infoseek has traditionally had an open door policy to submission. Unfortunately, this has led to a lot of abuse," said Jan Pedersen, director of search and spidering for Infoseek. He said that Infoseek recently discovered a site that managed to sneak more than 2 million pages into its index. Spamdexing of this magnitude has led to a new submission policy: Infoseek now accepts only a single URL per site, though Pedersen said Infoseek is also stepping up its efforts to spider other pages within a site.

Andrei Broder, AltaVista's chief technology officer, offered another alarming statistic, saying that roughly 95 percent of pages submitted to AltaVista's "add URL" page are spam and are eliminated. "What we're trying to do is provide relevant results," he said. A noble goal, to be sure, but if AltaVista is rejecting 95 percent of the URL submissions it receives, one has to wonder how it can possibly achieve its other goal of building a truly comprehensive index of the Web.

Google cofounder Sergey Brin had a refreshingly different attitude toward spam. "We don't believe in spam, so there's no mechanism for removing pages from the index," he said. Google's belief has two tenets: first, that if a search engine is truly doing its job, efficiently analyzing Web page content, bogus sites will simply never rise to the top of search results. Second, one person's spam is another's gold, so it's unfair and counterproductive to impose a form of censorship on pages that, no matter how obscure or potentially offensive to some, might be exactly what others are searching for.
 

The Great Game Goes On
Search Engine Strategies 99 was the first of a planned series of conferences bringing together Web content providers, search engine optimization specialists, and representatives from the major Web search services. The dialogue between these key players in the Web's Great Game should go a long way toward improving the state of Web search, even as the Web's chaotic growth continues apace. In the words of Google's Brin: "In the future, search engines should be as useful as HAL in the movie 2001: A Space Odyssey—but hopefully they won't kill people."
 
 


Chris Sherman is president of Searchwise, a Boulder, Colorado-based Web consulting firm, and associate editor of Search Engine Watch.
