Friday, October 29, 2004

SEOchat Frequently Asked Questions

SEOchat Frequently Asked Questions: "a comprehensive frequently asked question forum for common questions relating to search engine optimization. This forum will provide a very good resource for you to read up on the facts about SEO before asking on the forums. New to SEO? Start here by learning the basics and more advanced topics "


Web Feeds, Blogs & Search Engines

Web Feeds, Blogs & Search Engines: "A special report from the Search Engine Strategies conference, August 2-5, 2004, San Jose, CA.
This session explored how search engines are dealing with blogs and Web feed (RSS/Atom) content, and how providing such syndicated content can drive new search-related traffic...

Jeremy Zawodny of Yahoo! "It is changing the way that people get their information from search engines; it is aggregating information. "

User power..."By removing the need to visit a specific web site regularly, blogs have increased their traffic. Most importantly, unlike email newsletters, the power to subscribe and unsubscribe lies solely with the visitor, rather than a service that may--or may not--unsubscribe you from an email subscription when you request it."

"Why should search engine marketers care about blogs? "Because they have a different relationship between the user and the content," said Watlington. "If you think about pages sitting in an index, you are waiting for the search engine to come and query your data. On the other hand, because of the feed's relationship, the user is right there getting the data almost as fast as you create it."

"It is an active relationship (blog) vs. a passive relationship," she continued. "Blogs provide faster access (to data) to an informed and interested audience."




Yahoo! Search blog: Jerry's Take On What's Next in Search

Yahoo! Search blog: Jerry's Take On What's Next in Search: Search as a problem is still far from being solved... we have to "make search more relevant and personal."

Jeremy Zawodny (Tech Yahoo) blogged "Those two things are the natural progression for search and they are tightly connected to our concept of seamless integration. Search has to reach a higher bar: it has to enhance the user's life on a daily basis. Integration of search, community, personalization and content builds the foundation for relevancy in people's lives... RSS is allowing people to access exactly what they want and wireless is letting us deliver the information wherever you are. People aren't chained to their PCs anymore and neither is search. ..


Yahoo! Mobile

Yahoo! Mobile Yahoo! Local is perfect for finding restaurants, cool places to hang out and businesses near you.


Thursday, October 28, 2004

Caribbean forums: Hotels.com - Beware. Don't trust your booking to hotels.com

Caribbean forums: Hotels.com - Beware. Don't trust your booking to hotels.com: "Hotels.com - Beware. Don't trust your booking to hotels.com

Post removed

-:- Message from TripAdvisor staff -:-
TripAdvisor staff removed this post either because the author requested it, or because it did not meet TripAdvisor's forum guidelines.
We remove posts that do not follow our posting guidelines, and we reserve the right to remove any post for any reason.
Removed on: 5:05 pm, Oct 05, 2004 (16149"

Taormina forum: hotels.com - don't trust your booking to them: "hotels.com - don't trust your booking to them

DJM
Joined: Sep 2004
Posts: 5
London, England
Posted on: 11:41 am, Oct 05, 2004

I am still in shock. I booked a hotel through tripadvisor's link to hotels.com. I even had one of their customer services agents online while I did it [first time through I was kicked out of hotels.com for no reason]. The booking went through smoothly and was accepted. The agent confirmed that the booking had been accepted and that I should receive confirmation within a few days. One WEEK later, I called hotels.com asking where my confirmation letter was as I needed to make some special requests of the hotel. Hotels.com had a record of my attempt to book a room but their records show that the booking had been declined. I had received no communication from hotels.com and certainly nothing in the booking process would have suggested that my booking had been declined. I am now in the position of having no hotel in Taormina over the busy upcoming half term. HOTEL USERS BEWARE...... Don't trust your booking to hotels.com. They will leave you stranded without help or apologies. AWFUL."


WebProWorld :: LIST: High Ranking Directories and Indices

WebProWorld :: LIST: High Ranking Directories and Indices


KLOTH.NET - Trap bad bots in a bot trap

KLOTH.NET - Trap bad bots in a bot trap: "Block spam bots and other bad bots from accessing and scanning your web site"...

KLOTH.NET - List of Bad Bots: "A short list of bad spiders and nasty bots seen on kloth.net and kloth.org.
Most of them have been found ignoring the www.robotstxt.org/wc/exclusion.html robots.txt standard and running straight into a bot trap, others have been found harvesting mail addresses for sending spam"
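
The idea is simple enough to sketch. Here is a minimal, illustrative bot trap in Python (CGI style); the paths are made up and this is not KLOTH.NET's actual code:

    #!/usr/bin/env python
    # robots.txt should contain:
    #   User-agent: *
    #   Disallow: /trap/
    # so only robots that ignore robots.txt ever reach this script (served at /trap/).
    import datetime
    import os

    BLOCKLIST = "/var/www/data/bad_bots.txt"   # illustrative path

    def main():
        # Identify the visitor from the CGI environment.
        ip = os.environ.get("REMOTE_ADDR", "unknown")
        agent = os.environ.get("HTTP_USER_AGENT", "unknown")
        stamp = datetime.datetime.utcnow().isoformat()
        # Record the offender; a cron job or server config can then deny the IP.
        with open(BLOCKLIST, "a") as log:
            log.write("%s %s %s\n" % (stamp, ip, agent))
        # Give the bot nothing worth scraping.
        print("Status: 403 Forbidden")
        print("Content-Type: text/plain")
        print()
        print("Forbidden.")

    if __name__ == "__main__":
        main()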


Log analyser with fuller details for SEO: Statistics for awstats.sourceforge.net (2004-10)

Statistics for awstats.sourceforge.net (2004-10)


Yahoo Research URLs: Yahoo Adds "Beta" Search Tools & Enhancements

Yahoo Adds "Beta" Search Tools & Enhancements:
"http://next.yahoo.com/ Welcome to Yahoo! Next

Yahoo! Next is where you'll find some of the ideas and products that we're working on. Your comments will help us determine what remains just an idea versus what becomes the next big thing. So please play around with the prototypes and send your thoughts directly to the Yahoos working on them. Be a part of our team...Find more concepts from Yahoo! at
research.yahoo.com"


Wednesday, October 27, 2004

Yahoo battles Google for the cell phone | CNET News.com

Yahoo battles Google for the cell phone | CNET News.com: "Yahoo added a search feature for cell phones Wednesday, just a few weeks after rival Google launched one of its own.

While Google SMS (Short Message Service) uses text-only messages to deliver its results, Yahoo's new mobile service offers localized search results, maps and Web site icons that let people point, click and make a call...

Yahoo Chief Operating Officer Dan Rosensweig said the mobile Internet industry is at a "tipping point." As mobile use continues to grow, he said, more customers will want access to their Yahoo services."


Google Drives 70 percent of Traffic to Most Web Sites: Case Study on well optimised sites

Google Drives 70 percent of Traffic to Most Web Sites: "On my own sites and those of clients that I reviewed, Google sends over 70% of all search traffic to every one of those domains in every case. This includes Google foreign variants, Google Directory and Google image search (image search numbers are tiny)...

Average referred search engine traffic to those client sites reviewed for this article.

Google 74%
Yahoo 14%
MSN 9%
Ask 2%
All other SE's 1%

Mike Banks Valentine concludes "I believe these search engine traffic percentages are a direct reflection of relevance delivered by those search engines." Copyright © 2004 Mike Valentine


SEM Survey MediaDailyNews 10-27-04

MediaDailyNews 10-27-04: "Only one-quarter of active search marketers can be classified as sophisticated, according to JupiterResearch's second semiannual search engine marketing survey.."

Key Findings
Nate Elliott, author of the report... said that approximately 150,000 of the 200,000 marketers utilizing paid search are not bidding, tracking, measuring, or expanding their keyword lists effectively, if at all.

50,000 "sophisticates" are more likely to be large marketers...
33 percent of sophisticates have total marketing budgets that exceed $1 million
Sophisticates ... buy more keywords and advertise on more search engines
39% of sophisticates buy 1,000 or more keywords per month, compared to 14% of unsophisticates
sophisticates use an average of 3.8 search engines for search engine marketing campaigns, compared to 2.7 for unsophisticates
sophisticates are more likely to hire search engine marketing firms than unsophisticates due to the manpower and resources required to maintain large, expanding search engine marketing campaigns.

For unsophisticates, Elliott concludes "the most basic and most important thing" for them to do is to track and measure their search engine marketing campaigns.


The Best SEO Info & Forum Roundup Blogs

The Best SEO Info & Forum Roundup Blogs: "Best SEO Info & Forum Roundup Blogs"


Biggest Search Engine Announcement in SEO History

Interesting, mildly amusing at times and maybe a few nuggets along the way in this thread.....Biggest Search Engine Announcement in SEO History: "In your mind, and joking aside please, if a search engine was to announce something and precede this announcement with the tag line 'The biggest announcement in search history for an SEO', what would you think they are talking about.

What in your mind can a search engine announce to the public and have such a drastic impact on the SEO community?"


Yahoo! Search blog: Search Tricks #2: News Search: Yahoo RSS Launch

Yahoo! Search blog: Search Tricks #2: News Search: "public launch of RSS on Yahoo! News Search "

One comment states "our original news is already picked up by Google News and Moreover, but I've never had any luck figuring out how to invite Yahoo News to grab our content, too."... The answer should be interesting (if Yahoo gives one, that is...)


Forum discussing Search Engine Marketing For Travel-Related Sites article: Search Marketing in the Unfriendly Skies

Search Marketing in the Unfriendly Skies: "Search Marketing in the Unfriendly Skies "


Search Engine Marketing For Travel-Related Sites

Search Engine Marketing For Travel-Related Sites: "Search Engine Marketing For Travel-Related Sites"

At this session of the Search Engine Strategies conference, expert panelists discussed issues and strategies for the travel industry...

Main stats:
Jupiter Research found "for each dollar generated online, travel companies recorded an additional $5 revenue from traditional channels as a direct result of research that consumers did online."

Cornell study that "by 2005, 20% of all hotel bookings will be made online: half directly with hotel companies and half from third-party intermediaries"

Study by PhoCusWright states "65% of hotel buyers check three sites before making a buying decision"

John Waddy, President of Travel eMarketing, LLC said "In researching travel, consumers typically know the destinations they are interested in... always provide enough content on your site about the destination for visitors to make a buying decision," he said. "If a site visitor has to go to some other site to find out information on your hotel or event, you're probably going to lose that visitor as a conversion and repeat visitor."

Multi Channel Campaign Recommended:

Jon Schepke, Vice President and Principal at Meandaur...Schepke uses a number of SEM strategies that include PPC advertising, paid inclusion, and natural search engine optimization.

Natural SEO: With plenty of keyword-rich content, search engine optimization can be a cost-effective means of marketing a travel Web site.

For paid search strategies, pick the best tool to track the campaign, the channel, the keyword, the tactic... put a conversion tag on conversion pages... "that associates the revenue type, or revenue amount, with the actual campaign and the channel."
Schepke uses Broad Match on Google and Overture plus search engine advertising at FindWhat, Kanoodle, and Enhance Interactive (formerly Ah-Ha)... specialized comparison search services, such as SideStep (travel) and NexTag (shopping). He also recommends these marketing channels for travel sites:

Hotel or branded sites (such as Hilton.com)
Destination/travel sites (such as Chamberofcommerce.com)
Third-party intermediaries (such as Expedia.com)

The self-funding model

Market through a single channel:
Example: ""The client had $30K of seed money. The goal was to generate $200K for every $10K spent."

Reinvesting 7, 10, or 13 percent of revenue was part of the self-funding strategy. "Assuming the same gross revenue and reinvestment percentage, your campaigns can begin to grow monthly," said Pingel. Within one month, revenue jumped from $10K to $26K. ROI can be determined by gross revenue or bottom line profit.
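
To make the self-funding arithmetic concrete, here is a small Python sketch of the compounding described above. The 20x return multiple is just the "$200K for every $10K spent" goal restated; the other figures are assumptions for illustration, not Pingel's actual data (though note that reinvesting 13% of $10K at a 20x return gives exactly the quoted jump to $26K):

    def self_funding(start_revenue, reinvest_rate, return_multiple, months):
        """Print month-by-month spend and revenue under the self-funding model."""
        revenue = start_revenue
        for month in range(1, months + 1):
            spend = revenue * reinvest_rate        # share of last month's revenue
            revenue = spend * return_multiple      # what that spend brings back
            print("month %d: spend $%.0f -> revenue $%.0f" % (month, spend, revenue))

    # Reinvest 13% of revenue at the 20x "$200K per $10K" target return.
    self_funding(start_revenue=10000, reinvest_rate=0.13, return_multiple=20, months=6)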

The article ends with "Travel site tips" which mirror the plans for Totaltravel working with Summit Media.


Tuesday, October 26, 2004

Beyond SEO – Search Engine Optimization: eyefortravel.com

eyefortravel.com - Travel Distribution News, Events and Analysis: "Estimates suggest currently only 1-2% conversions happen for any SEO or offline marketing campaigns...

According to Yesawich, Pepperdine & Brown, almost six out of 10 leisure travelers now actively seek the "lowest possible price" for travel services
On average a leisure traveler visits 3-4 websites before making a purchase
Equally important is the Availability factor
One Stop Shop vs. Specialize in Airfares or Hotels - The big brands in online travel have been emphasizing packaging, especially dynamic packaging.
Inclusive rates are easier to navigate, compare and understand.

Bhanu Chopra is the CEO of RateGain. RateGain provides Internet based competitive pricing intelligence to the global travel industry."


J Whalen and SEW on Title Length.....Rank Write Roundtable - Issue No. 026

Rank Write Roundtable - Issue No. 026: "I believe in making all of my tags an appropriate length for what I need to get into them. This varies from page to page and site to site. I can't think of a time when I've EVER counted the characters in any tag I've created. If,
however, you are a numbers fan, from what I understand, most of today's search engines will display 60 to 115 characters of your title tag. Remember, though, this doesn't mean that if you make it shorter or longer than those amounts, you'll be penalized. The same thing goes for the other Meta tags. Generally, most engines will index approximately 200 characters of the Meta description tag and about 1000 characters of the Meta keyword tag. (By the way, these numbers were obtained from www.searchenginewatch.com.)"
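
If you are a numbers fan, a quick way to check your own tags against the figures quoted above is a rough sketch like this (Python; the limits are simply the display/indexing figures from the quote, not penalty thresholds):

    # Display/indexing limits quoted above (characters).
    LIMITS = {"title": 115, "description": 200, "keywords": 1000}

    def check_tags(title, description, keywords):
        """Return warnings for tags longer than the quoted limits."""
        warnings = []
        for name, text in (("title", title),
                           ("description", description),
                           ("keywords", keywords)):
            if len(text) > LIMITS[name]:
                warnings.append("%s is %d chars; engines may only use ~%d"
                                % (name, len(text), LIMITS[name]))
        return warnings

    print(check_tags("Latest British hotel deals, offers and late availability " * 3,
                     "Hand-picked hotel deals across the UK, updated daily.",
                     "hotels, deals, UK"))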

High Rankings All about the Title Tag for Search Engine Optimization

Search Engine Watch Search Engine Placement Tips: "titles"


Yahoo! Search blog: An Interview with Paulien Strijland of Yahoo! User Experience Design

Yahoo! Search blog: An Interview with Paulien Strijland of Yahoo! User Experience Design: "Interview with Paulien Strijland of Yahoo! User Experience Design"

Q: How is Yahoo!’s philosophy on site design and page layout different from others? Why do our pages seem more complex than others?

A: I actually think many of our new products are quite clean and beautiful. For example our new Local Search product. But it all depends on the purpose of the page or the product. You have to compare apples to apples.

Q: Okay. How about Google’s front page compared to Yahoo!’s front page?

A: Well again, the purposes of both are very different. Besides search, people come to Yahoo!’s front page to do everything from getting driving directions to finding stock prices to sending email. You have to figure a way to elegantly include all the things that people are trying to access on one page. Aesthetically, we have a very different challenge from sites like Google that essentially provide variations on one main product. With the creation of our new front page (now in beta) as well as other key pages on the site, you’ll see that we’re putting even more attention into balancing content with aesthetic


Forecast of the number of links gained by a full-time link campaign: Too Many Links Too Fast -> High Rankings Search Engine Optimization Forum

Too Many Links Too Fast -> High Rankings Search Engine Optimization Forum:

Googlewhacked Posted: Oct 19 2004, 02:00 PM "'Let's say that you have a person who works on obtaining directory submissions & links from relevant, related websites all day long. Assuming that (s)he manages to find & submit the site (s)he represents to 10 relevant sites per day (a realistic number, IMHO), and that an average of 70% of those result in links back to the site being promoted, that translates to:

7 sites / day
35 sites / week
140 sites / month (going with a 4-week month)
1820 sites / year (52 weeks)'

...there are link buying programs out there that will give you a text link from every page of their 12000 page site, but the vast majority of those links are devalued, so the link pop. benefit is negligible."
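
The arithmetic checks out; a quick Python calculation makes the assumptions explicit (five-day weeks, four-week months, a 52-week year, which is why the yearly figure is 1820 rather than 12 x 140):

    submissions_per_day = 10      # directory/link submissions per working day
    success_rate = 0.7            # share that actually result in a link back
    links_per_day = submissions_per_day * success_rate    # 7
    links_per_week = links_per_day * 5                     # 35 (5-day week)
    links_per_month = links_per_week * 4                   # 140 (4-week month)
    links_per_year = links_per_week * 52                   # 1820 (52 weeks)
    print(links_per_day, links_per_week, links_per_month, links_per_year)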

powerofeyes Posted: Oct 19 2004, 02:36 PM "our recent tests shows yahoo puts a lot more weight on anchor text from unique established sites than google or any other search engine"


Yahoo, Adobe team on search | CNET News.com

Yahoo, Adobe team on search | CNET News.com: "Yahoo and Adobe Systems announced a partnership Monday to combine Adobe's PDF format with Yahoo's Internet services...

Features may include:
"online service to convert Web content into PDF (Portable Document Format) files...Create Adobe PDF Online
a toolbar that would add access to Yahoo Search and other features to Adobe Reader.. pop-up ad blocking "


New Mobile Phone Search Services

A New Mobile Phone Search Service: "UpSnap is the brainchild of Tony Philipp, who built Lycos Europe and later worked with Vivisimo. Partners in the venture include web veterans Richard Jones from Fortunecity and Wendell Brown from Evoice...either from your SMS-enabled cell phone or via UpSnap.com. Simply type the name and location of a business you're trying to find and UpSnap returns directory information. Your phone does not need a browser or the ability to connect to the internet to use the service. The service is free and is supported by sponsored listings. Philipp says that future services will include white pages, shopping comparison, search and tracking of eBay auctions and other services...

The search via mobile phone space is suddenly heating up, with the launch of a number of new services this year. News Editor Gary Price will be taking a closer look at two promising services, Mobot and Evernote, in an upcoming issue of SearchDay."


Monday, October 25, 2004

SERPs report

A quick check of SERPs this morning shows the UK at 17 in Yahoo for "latest British hotel deals", but not in Google. The three phrases monitored in Google have risen in the rankings, however. Pages included by MSN have risen to 909 from 207 and, most oddly, pages in AltaVista are up to 81 from 5.


Yahoo! Search latest version

Yahoo! Search


SiteProNews: Search Engines and The Meta Description Tag J Whalen

SiteProNews: Search Engines and The Meta Description Tag - which engines use it for ranking and how to control the text shown in SERPs


Saturday, October 23, 2004

BIG changes in Yahoo SERPS - more sites dropped

BIG changes in the SERPS - a thread to watch...


Usability, SEO and Web Design: Kim Krause blog

Usability, SEO and Web Design: "about usability, seo, web dev, search engines, and Internet stuff"


Search Marketing UK Trading Association - new UK organisation

Further professionalisation: Search Marketing UK Trading Association: "Search Marketing Association UK (SMA UK) is a new organisation designed to promote Search Engine Marketing across the UK."


Friday, October 22, 2004

Retaining Traffic after a Web Site Redesign

Retaining Traffic after a Web Site Redesign

A user-centered redesign process based on user data and planning ahead means site owners can minimize the effects of a site redesign on traffic and conversions.


MediaPost Advertising & Media Directory

MediaPost Advertising & Media Directory If you use search to present your product or service, you should start looking at mobile search. It is going to expand and change the search industry in ways we haven't even contemplated yet...


Forbes.com: Yahoo Acquires Another E-Mail Startup

Forbes.com: Yahoo Acquires Another E-Mail Startup:

"Bloomba was launched last year by San Mateo, Calif.-based Stata Labs. Many analysts praised it as being more nimble and elegant than Microsoft's Outlook, but doubted Bloomba could survive without a larger patron...

The other e-mail company acquired by Yahoo this year, Oddpost, made a Web-based product radically different in approach from Bloomba but equally praised for simplifying the Internet's most popular application."


Thursday, October 21, 2004

New UK search engine & directory About UKWizz

About UKWizz


The Keyword Tools Trap: Opt for relevance, not high-traffic phrases

The Keyword Tools Trap: "why I had suggested several phrases to them that showed 0 searches in Wordtracker. The reason? Those phrases were showing REAL referrals in their logs for several variations. People were actually using those phrases to search, and although they found my client's site, it wasn't doing a good job of focusing on these relevant searched-for terms. "
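
One practical way to act on this is to mine your own logs rather than a keyword tool: pull the phrases out of the search engine referrer strings and count them. A rough sketch in Python, assuming a combined-format access log; the log path, engine list and query parameters are illustrative:

    import re
    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    # Matches the quoted request, status, size and referrer of a combined-format log line.
    REFERRER_RE = re.compile(r'"[^"]*" \d+ \S+ "(?P<ref>[^"]*)"')
    ENGINES = ("google", "yahoo", "msn")       # referrer hosts treated as search engines
    QUERY_KEYS = ("q", "p")                    # q = Google/MSN, p = Yahoo

    def search_phrases(log_path):
        """Count the search phrases found in search-engine referrers."""
        phrases = Counter()
        for line in open(log_path, encoding="utf-8", errors="replace"):
            match = REFERRER_RE.search(line)
            if not match:
                continue
            referrer = urlparse(match.group("ref"))
            if not any(engine in referrer.netloc for engine in ENGINES):
                continue
            params = parse_qs(referrer.query)
            for key in QUERY_KEYS:
                for phrase in params.get(key, []):
                    phrases[phrase.lower()] += 1
        return phrases

    # print(search_phrases("access.log").most_common(20))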


Wednesday, October 20, 2004

LookSmart names new CEO | CNET News.com

LookSmart names new CEO | CNET News.com: "LookSmart names new CEO" looking to expand...


More re AOL, Google expand alliance in Europe | CNET News.com

AOL, Google expand alliance in Europe | CNET News.com: "Google has already been providing Web search results for all AOL search-related products in Europe. Google AdWords advertisers will now appear on all search results on AOL's European Web sites, the companies said. "

How does this fit with AOL being in talks with Kayak?


RSS 101 SiteProNews: Article Syndication - A New Vehicle for King Content

SiteProNews: Article Syndication - A New Vehicle for King Content: "It's All About Content Syndication"


Tuesday, October 19, 2004

Cre8asite forums. Website Design - Would the world be a better place without Jakob Nielsen?. [ Search Engine Optimization, Usability and Web Design. ]

Cre8asite forums. Website Design - Would the world be a better place without Jakob Nielsen?. [ Search Engine Optimization, Usability and Web Design. ]: First re-read the "great Jakob backlash " then this newer discussion....


Synergy in Search Engine Marketing

Synergy in Search Engine Marketing: "Combining SEO and PPC offers an overall stronger search engine marketing campaign and encourages a well thought out Web site with more relevant content, pleasing graphics, and ultimately higher visibility among the major search engines...

Overall, the field of search engine marketing is enhanced when the key processes of SEO and PPC work hand-in-hand to create a stronger form of search engine marketing "


Local Search a short look at Yahoo Local and Google Local : Internet Search Engine Articles

Local Search a short look at Yahoo Local and Google Local : Internet Search Engine Articles: "Listings are added to Yahoo's local search by clicking on the link marked 'Add/Edit a Business'. This link opens a fairly straightforward form asking for basic information about a business, including the nature of the submitter's relationship to the business. Right now, it appears that submission to the local search database is free...

Businesses wanting to be listed in Google-local are asked to submit their information via email but before they do, they should check to see if they are already in the index. Google-local gets its listings from a number of sources including local Yellow Pages and telephone directories. An interesting feature of the culture at Google is their willingness to help businesses update any out-of-date information carried in print directories they get listings from. For example, if your business moved locations after the most recent telephone directory was published, the information in that directory would be incorrect, as would your listing at Google-local. Google invites businesses to email them any contact information changes (local-listings@google.com) and they will not only update your listing at Google-local, they will also pass the new information to the source that provided them contact info from your area"


Cloaking 101 - Questions and Answers Fantomaster

Cloaking 101 - Questions and Answers: "Fantomaster, our resident 'industrial strength cloaker'"


Saturday, October 16, 2004

WebProWorld :: Yahoo RSS Plan Yields Additional Advertising Space

WebProWorld :: Yahoo RSS Plan Yields Additional Advertising Space: "Yahoo has announced that they are planning on offering an advertising service that will allow for sponsored links to be placed within syndication feeds. "

Links to RSS info..


Friday, October 15, 2004

Google Desktop Search Download

Google Desktop Search Download

O'Reilly Network: Google Your Desktop review at O'Reilly

Google Launches Desktop Search Tool Slashdot discussion


Thursday, October 14, 2004

Sabre releases internet-based booking tool for agents - 14-Oct-04.

Sabre releases internet-based booking tool for agents - 14-Oct-04.: "MySabre is a combination of technology from Sabre Holdings and Yahoo! - an internet-based tool for agents that combines traditional bookable services such as airlines, car hire, hotels, cruise and rail, with other content from the internet and third-party suppliers."


SEW Forum; Cloaking Questions: Yahoo redirects problem surfaces again

"Industrial Strength White Hat" Cloaking Questions: fantomaster writes "Mixing cloaked and non-cloaked pages on the same domain (and IP, for that matter), while technically possible, is not recommended in view of the engines' declared anti-cloaking policy."


Algo Update? Don't Panic keep to the basics

Algo Update? Don't Panic: These SEO "tips are basic common sense and are based on Google's current behavior."


Wednesday, October 13, 2004

WebProWorld :: Do Scripting Errors Affect Site Ranking In SERPs.

WebProWorld :: Do Scripting Errors Affect Site Ranking In SERPs.:

Hal posts: "some real problems I have with a site not meeting W3C validation. I've researched two such sites in recent weeks."

Hal concludes: "The cleaner the code, the higher in the page your valuable key phrases will appear. And it is that which can have a direct impact on search engine placement."


Re MSN's new search technology: forum and links on Block Analysis 101

Block Analysis 101: Tech review "explains in simple terms the block level algo"

See especially the comments by Danny Sullivan (DS), Editor, SearchEngineWatch.com: He puts forward the theory that "Google's "named entities" might be a similar thing, but not tied to links. This is something they talked about recently: Google Demos Word Clustering. It sounded to me like they were, by analyzing language rather than visual cues, trying to understand what the core content of a page is."

Orion recommends an early paper about Microsoft's VIPS (VIsion-based Page Segmentation): "ImageSeer: Clustering and Searching WWW Images Using Link and Page Layout Analysis"

Current conclusions are that "block analysis" will make crawling faster, and standard nav links and paid ads may be devalued in the algo - DS puts it as: if links are deemed "unnatural" or not part of the core content -- using both visual and language cues -- "I think they'll be discounted. Not banned, not ignored -- just not weighted as highly."

One workaround is to use CSS so that links can sit where you want them in the code, regardless of where they appear on the page...




Search Engine Spider Simulator is back...

Search Engine Spider Simulator


Forbes.com: Update 1: Yahoo's Third-Quarter Profit Nearly Quadruples

Forbes.com: Update 1: Yahoo's 3Q Profit Nearly Quadruples: Yahoo "earned $253.3 million, or 17 cents per share, for the three months ended in September, up from net income of $65.3 million, or 5 cents per share, at the same time last year.
...Revenue for the period totaled $906.7 million, a dramatic increase from $356.8 million last year. Yahoo's revenue fell to $655 million after subtracting commission paid to some of its advertising partners"


Tuesday, October 12, 2004

More on Snap...New Snap site thinks outside the search box | Tech News on ZDNet

New Snap site thinks outside the search box | Tech News on ZDNet:

The Snap">Snap index is powered with data from Gigablast,LookSmart, DMOZ, Looksmart and anonymous data feeds from Internet service providers.

Snap Rank uses traditional search engine methods plus post-query click-stream data to determine ranking.

Snap.com: "The main difference is the tools it uses to help people refine search once they have typed a query into the search box.

It's also one of the first search engines to harness data on "user intentions," extrapolating meaning from words typed into a search box.

"We can infer what people have done (in the past with the same search) and then change the actual layout of the page and sort order," said Tom McGovern, CEO of Perfect Market Technologies, which owns Snap."

Snap uses data feeds to add relevancy to their SERPs:

1) Over one terabyte of data from third-party ISPs which track, anonymously, what people do after they've typed in specific search terms; this data is used to compute the relevancy of certain searches and sort results accordingly

2) Data on the routes people take after they leave the search engine, which helps reorder future results by relevancy

3) Most revolutionary is that the service is transparent to advertisers and visitors...

4) Snap will make money by selling advertising placements at the top of search results, but the twist is that it will let marketers pay specifically for people who buy at their site as a result of the Snap listing, a "cost per transaction" model...

5) Advertisers can set the cost per transaction: "for example, that they want to pay 25 percent of their product cost or $4 for every widget they sell if a consumer buys it from the ad at Snap--and that information will be displayed in Snap's product listings. In comes the transparency."

Also :

Product Finder tool - above web results, from PPC partners? Looksmart, www.smarter.com (comparison shopping engine), Dmoz and Gigablast


Net Stocks: A trade for this season: Buy Yahoo, short Google? - Media - General

Net Stocks: A trade for this season: Buy Yahoo, short Google? - Media - General: "an intriguing trading call made news on Monday. Buy Yahoo, and short Google...

Mahaney is far from being Wall Street's only Yahoo bull. Even if its search business isn't as robust as Google's -- and that's a big if -- Yahoo still has a branded-ad business that has yet to truly take off.

Lanny Baker predicts upbeat buzz from Yahoo's quarterly report. Baker noted that investors will likely be surprised by Yahoo's branded advertising results as he's heard that the fourth-quarter advertising inventory is nearly sold out and Yahoo has been able to charge higher ad prices."


Yahoo appeal..sitematch

Yahoo appeal: "Yahoo on sitematch: As part of
the SiteMatch program, every submitted URL is reviewed. For a URL that has been previously found to have violated our Content Policy Guidelines, this review may erase that (or may confirm it).

Review of an URL is also available for free by emailing ystfeedback@yahoo.com (we'll also tell a site if they have violated the Yahoo! guidelines), so you don't have to pay for SiteMatch to get a site review. "


Monday, October 11, 2004

Purchase tickets - Internet Marketing Masterclass

Purchase tickets - Internet Marketing Masterclass

Registration £175.00 plus VAT
Book now online before October 20th and get a £25.00 early-bird discount!

***EARLY-BIRD DISCOUNT PRICE ONLY £150.00 plus VAT!***
Includes networking lunchtime buffet and morning and afternoon refreshments.


Filthy Linking Rich - forum discussion

Filthy Linking Rich: "the purpose of good search results is not the same is democracy anyway. It's not abaout being fair, but being fast and accurate and 'good'. "


SERPS

No change at Yahoo, no feedback from AU


On PageRank, wise words by Richard Grady - SiteProNews: Google Page Rank - Important or Just Another Number?

SiteProNews: Google Page Rank - Important or Just Another Number?: "So, in summary, is Google Page Rank important to your business?

Only as an "indicator of how many other sites link to yours and how important Google considers your site to be"

He does not "place too much importance on this statistic" and would not pay " for a link from a website just because it has a high PR."

He states "Google changes it's rules on a regular basis and I see little point in chasing a particular PR on the basis that it might get you higher search engine rankings. If Google does decide to do away with PR, all your work (and money if pay fpr links) will have been for nothing.

A more ethical and logical campaign would " concentrate on building quality, relevant links from sites that are connected in some way to your own site content."
This yields relevant traffic and an increase in PR over time.
"If you do things this way and Google does scrap the PR indicator, it shouldn't affect you in any way and the links you have in place will continue to benefit you."

NB: Neither a low Alexa rating nor a high PR guarantees traffic or sales, which are the true aim.


Sunday, October 10, 2004

Filthy Linking Rich by Mike Grehan TBC

Bias caused by dependence on linking...Filthy Linking Rich by Mike Grehan: "Are search engines giving a fair representation of what's actually available on the web? Not really. If pages were judged on the quality and the relevance for ranking, then there would be less search engine bias towards pages which are simply popular by 'linkage voting'. Unfortunately, quality is subjective so finding a universally acceptable measurement or metric is not going to be easy"

SEs apply "random graph theory to the web, they have viewed it as a type of static, equilibrium network with a classic Poisson type distribution of connections." but net is not static...

network theory throws light on a number of social mechanisms which operate beyond the world wide web to structure it...

Lada Adamic, of Xerox, Palo Alto Research Centre... discovered that, just as in the social sphere, one could pick two sites at random and get from one to the other within four clicks...

one of the primary features of a random graph is that its degree distribution always has a particular mathematical form known as Poisson distribution...

physicist Albert-László Barabási ...has shown that many networks in the real world have degree distributions that don't look anything like a Poisson distribution. Instead, they follow what is known as a power law...relates to links and nodes. ...He has discovered that all networks have a deep underlying order and operate according to simple but powerful rules...

Clustering ...is an almost universal feature, not just in social networks, but of networks in general...Perhaps the greatest discovery of the laws of network organisation focuses on the idea of "hubs" and how they form. These are the centrepieces of networks, around which many links form...


Hyperlink based "popularity" algorithms (esp PR) are "inherently biased against new and unknown pages.

It is essential to be aware that "the importance or quality of a page" is distinct from "the relevance of a page" to a user query.

Relevance is query-specific...

He quotes an experiment which found that "the top 20% of the pages with the highest number of incoming links obtained 70% of the new links after seven months, while the bottom 60% of the pages obtained virtually no incoming links at all during that period."
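
The mechanism behind that statistic is easy to simulate. Here is a toy preferential-attachment model in Python (my sketch, not Grehan's or Barabási's code): each new page links to an existing page chosen with probability proportional to the links it already has, and a small fraction of pages ends up holding most of the links:

    import random

    def preferential_attachment(n_pages):
        """Grow a link graph one page at a time; each new page links to an
        existing page chosen with probability proportional to its current links."""
        random.seed(1)                 # reproducible toy run
        targets = [0, 1]               # one entry per link endpoint
        links = {0: 1, 1: 1}           # pages 0 and 1 start linked to each other
        for new_page in range(2, n_pages):
            chosen = random.choice(targets)      # the "rich get richer" step
            links[chosen] += 1
            links[new_page] = 1
            targets.extend([chosen, new_page])
        return links

    links = preferential_attachment(10000)
    counts = sorted(links.values(), reverse=True)
    top_fifth = counts[: len(counts) // 5]
    print("five most-linked pages hold:", counts[:5], "links each")
    print("share of all links held by the top 20%% of pages: %.0f%%"
          % (100.0 * sum(top_fifth) / sum(counts)))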

Ends with the teaser that "A new model has been developed which can be used to predict and analyse competition and diversity in different communities on the web."

This has to be clustering ( See Teoma)

References:
"rich get richer" problem at Google: Impact Of Search Engines On Page Popularity.


A New Paradigm For Ranking Web Pages On The World Wide Web. http://www2003.org/cdrom/papers/refereed/p042/paper42_html/p42-tomlin.htm

On network science he recommends: Evolution of Networks




PhysOrg: Going from a 'Web of links' to a 'Web of meaning' (Again...)

PhysOrg: Going from a 'Web of links' to a 'Web of meaning': "Heflin recently received a five-year, $500,000 CAREER Award from the National Science Foundation to study distributed ontologies that could bring the Semantic Web closer to reality....

Researchers in artificial intelligence have proposed to make ontologies explicit, says Heflin. In computer science, an ontology encodes knowledge about the world, and can thus determine what is implied and find answers without explicit instructions. Ontologies can be used by people, and by databases and other applications that need to share information about domains, or specific subject or knowledge areas, such as cars, medicine or real estate...

Heflin wants to look at ways of partitioning the Web into useful subsets so users can determine which ontology to use when they have a query and can find an ontology that will point them to the web page that is most suited to the perspective of their search.

"I want to develop an underlying theory so we can understand and build a system that can handle large amounts of data," says Heflin. "That system should be able to look at medicine from the point of view of a patient, a doctor or a pharmaceutical manufacturer, or to search universities from a professor's or a student's point of view."


Branding strengthens on search: Yahoo's Q3 expected to be upbeat

Yahoo's Q3 expected to be upbeat: "analysts, like many who follow Yahoo, expect the advertising businesses on the Web -- paid search and branded -- to have been stronger than originally expected...

Key points: "Goldman's Noto raised his branded-ad figures more than paid search...

Yahoo's user base.. grew 12.5 percent annually versus 7.7 percent for the industry...

Yahoo's search page views also outperformed the industry"


Saturday, October 09, 2004

Exclusive Demonstration of Clustering from Google : Search Engine Lowdown: Web 2.0 -

Search Engine News :: Search Engine Lowdown: Web 2.0 - Exclusive Demonstration of Clustering from Google: "Peter Norvig, Ph.D., Director of Search Quality for Google ... revealed that Google had been working on three different tasks to better understand the web.

1. Statistical machine translation
2. Named entities
3. Word clusters"

Google Demos Word Clustering: "Google Demos Word Clustering...

By understanding clusters of search results, it may be easier for Google (and other search engines) to determine pages that don't seem to belong somehow on a particular topic -- in particular, spam pages that given their often artificial nature might stand out more."


WebProWorld :: Reciprocal links to be penalised?

WebProWorld :: Reciprocal links to be penalised?: "Reciprocal links to be penalised?"


Thursday, October 07, 2004

AOL Branding Campaign

America Online to Break Branding Effort: "America Online Thursday will kick off a brand campaign aimed at establishing itself as an advocate for consumers online. The company will also unveil a new logo,"


New search engine Snap search: eg noahs bondi beach

Snap search: noahs bondi beach. Snap is brought to you by Idealab, a creator and operator of technology companies.


WebProWorld :: New MSN Search Preview

WebProWorld :: New MSN Search Preview: "Gates is supposed to be a guest speaker at the upcoming SES Conference in Chicago mid-December. Maybe then? Ronnie T. Dodger "

Spoof of Google Watch at MSN Watch, a creation by Aaron Wall

Web Development News: MSN Search Tech Preview - Around the Horn: "MSN Search Tech Preview"

to repeat what Nacho said at SearchEngineWatch forums:
Keyword-in-domain continues being strong, but not as much as Round 1.

Cache is as fresh as your grocery store around the corner.

Improved navigation and nicer looks.

The index seems to have been growing nicely.

link:www.domain.com is working well too.

Big authority websites carry weight in ranking. Seems like mom & pop will continue to struggle.

On page factors and content weigh in heavily.

Results for other languages are not coming up well (I tried Spanish).

And my response:
This could be true. I think the answer is as previously noted about anchor text and titles.

Damn ... I knew I forgot to look at that! That wasn't there in the first preview.

My feeling is that this is a temporary shell. They will stick with the current MSN Search layout (or another one that we have not even seen).

Oh hell yes! It has grown very well. They are on a tear.

I found the link: showed less than domain:, plus you had to add a qualifying word to trick the query into revealing your pages.

I am not so sure about that. I can see a good mix here. In some cases, mom and pop or startup sites are doing quite well. But this could be subjective too. Everyone will have a different interpretation of that.

Umm ... I think it is anchor text and titles. But that is a debate, I hate to debate. Let's just say you are correct! ;-)

Tomaré su palabra en eso. (I'll take your word on that.)


"Search is a Platform. Where is it Going?" Search Engine News :: Search Engine Lowdown: Web 2.0 - Search Engine Execs Chat

Search Engine News :: Search Engine Lowdown: Web 2.0 - Search Engine Execs Chat: "Summary of panel discussion:
"Search is a Platform. Where is it Going?" No Google, but the panel was made up of some of the top people in the search industry:

Steve Berkowitz - Ask Jeeves
Udi Manber - Amazon's A9
Louis Monier - eBay
Christopher Payne - MSN
Jeff Weiner - Yahoo


Weiner did say that 'linking relevancy is flawed as the sites that sit at the top of the search results, tend to get linked-to the most and end up staying at the top.'"

concludes: "No major announcements, no real shocks. However, it is clear that search as we know it will change over the coming years. All of the panel seemed committed to personalization with many favoring a desktop solution too!


Yahoo! Next - research, demos, papers

Yahoo! Next


Yahoo's advice on using the meta-keyword tag - JimWorld Gazette - Issue #213 - October 6, 2004

JimWorld Gazette - Issue #213 - October 6, 2004: Yahoo's advice on using the meta-keyword tag: http://help.yahoo.com/help/us/ysearch/ranking/ranking-02.html ...

Concludes: Yahoo! "search on "sedands" (no quotes, of course) and see what page comes up in the number one position. Try the same search on msn.com and on teoma.com (who claims they don't even index the keywords meta tags).

Seems to me the keyword meta tag has a bit of life left in it yet. :-) "


Being 'Web Standards' Compliant JimWorld Gazette - Issue #213 - October 6, 2004

JimWorld Gazette - Issue #213 - October 6, 2004: "WEB STANDARDS: ARE YOU READY YET?
Guest Article by Diane Vigil

As some of you know, the term 'Web Standards' does not mean just building websites to comply with standards adopted by the World Wide Web Consortium (W3c.org), nor even that a web page's code is certified to be valid by the W3C's validation tools. Rather, 'Web Standards' refers to modern coding that makes websites accessible to a variety of browsers and, as they say, 'devices' ... making it unnecessary to create, maintain and pay for multiple versions of a website to support 'devices' now in existence or yet to be conceived, designed and marketed. (More at http://webstandards.org/about/ )"


Tuesday, October 05, 2004

iMediaConnection: SearchTHIS: DoubleClick on Pay-for-Play

iMediaConnection: SearchTHIS: DoubleClick on Pay-for-Play: "a new search white paper hit the streets today offering a summary collection of guiding principles in pay-for-placement search and previews the DoubleClick Search Trend Report. "

Comments that "Search is complex. Search is difficult. Search can be unforgiving and costly if you don’t watch it closely" -- just a couple of the report's delicate pearls of wisdom delivered with a sledgehammer.

Main focus: the 80/20 theorem in pay-for-placement re the number of terms in any one campaign - 62 percent of active keywords (an active keyword being a phrase that receives at least one click a month) generate less than ten clicks per month.

Notable conclusions include: more than 82 percent of active keywords are 2, 3 or 4 words long, while only 7 percent are 1 word long.
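
Reproducing that kind of breakdown for your own keyword list is easy enough; a rough sketch in Python (the sample keywords are made up):

    from collections import Counter

    def length_breakdown(keywords):
        """Return the percentage of keywords with 1, 2, 3, ... words."""
        buckets = Counter(len(k.split()) for k in keywords)
        total = sum(buckets.values())
        return {words: round(100.0 * count / total, 1)
                for words, count in sorted(buckets.items())}

    print(length_breakdown(["hotels", "london hotels", "cheap london hotels",
                            "latest british hotel deals"]))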

Link to download the DoubleClick white paper


Idealab chief stakes out new direction in search | CNET News.com

Idealab chief stakes out new direction in search | CNET News.com: Insider Pages: people sign up to connect with friends and mine their recommendations for local shops and services. The free product, still in experimental form for Los Angeles residents only, puts a new spin on social-networking services like Friendster by infusing it with the local insider feel of Craigslist.


The vs-Switch on Yahoo | JonnyGoodBoy's Weblog

The vs-Switch on Yahoo | JonnyGoodBoy's Weblog: "If you do a yahoo search and add 'vs' to your search you will get the most recent slurped pages, that are not yet published on yahoo."
Hmm cannot get this to work....


Cool re links: Tell Us About Your Web Site!

Tell Us About Your Web Site!: "ResearchBuzz does not participate in link exchanges. If we like the resource we're reviewing, we link to it. A reciprocal link is not required."


ResearchBuzz: News and Information about Search Engines, Databases, and Other Online Information Collections

ResearchBuzz: News and Information about Search Engines, Databases, and Other Online Information Collections: The author, Tara Calishain, is a "member of the Yahoo! Search Advisory Council. I got permission to make that public".

Books:
Google Hacks
Web Search Garage

Also recommend two of her articles:
'Four Things Yahoo Can Do that Google Can't'
'Seven Ways to Save Time Searching'


Expired domains solutions : marketing strategies

Hottest trends for Christmas 2004 marketing strategies: "expired domains that used to have high PR (PageRank). Should I just redirect these domains to our main site, make jump pages, or create html pages with lots of relevant information?"

Google will see the redirect and reset PageRank

Put relevant content on page(s) ASAP

Check what incoming links said and use equivalent content

Do not create low value "jump" pages or any other sort of "no content" redirect


Yahoo Adding New Search Engine Tools Forbes.com:

Forbes.com: Yahoo Adding New Search Engine Tools: "Yahoo Adding New Search Engine Tools"
Jumping on the personalisation wagon with a function to save and share searches via RSS - a very similar format to Ask Jeeves Inc. (Teoma) and A9.com (Amazon.com Inc.)

Named " digital dashboard" it will search users PC for e-mail, documents or info from their "My Web" of search archives...


Beta version:

click link below listing to save
can add notes to the page (listing)
file it in a special folder
set listing to appear in RSS form on their My Yahoo page
unlimited storage of search histories
paid ads from Overture for £££


Monday, October 04, 2004

Yahoo launches local-search engine | CNET News.com

Yahoo launches local-search engine | CNET News.com: "Yahoo announced Monday that it's launched Yahoo Local, part of a move to give Web surfers more exact and more comprehensive local-search results. Previously, the service had been available only in beta form. "


The One-Two Punch: SEO and PPC

The One-Two Punch: SEO and PPC: "The study results clearly indicate SEM must include both paid and natural SEM to reach the entire audience.
Though it appears intuitive, many companies engaged in natural search engine optimization (SEO) resist investing in paid search advertising. And many companies engaged in paid search advertising aren't yet pursuing an SEO strategy...

Gary Stein, senior analyst of online advertising and marketing for Jupiter Research (a Jupitermedia Corp. division). "There are way more clicks that go to the algorithmic than the paid search. Our estimates are five out of seven clicks go to the algorithmic, or natural, results."

In Yahoo, research indicates 60 percent of clicks occurred in natural search results, the rest in paid search ads. Companies seeking to maximize returns must be found on both sides of the SERP."


Definition: 'hubs and authorities': Teoma, Ask Jeeves, HITS and Clever - Mike Grehan's eMarketing News - Internet Marketing Tips

Mike Grehan's eMarketing News - Internet Marketing Tips: Interview by Mike Grehan with Paul Gardi, SVP Search at Ask Jeeves/Teoma, and Alexa Rudin, Director of Communications at Ask Jeeves.

The main topic discussed is social network theory, which is about understanding how people interact and how networked structures are predictive of certain links, like hubs and authorities. This goes further than the initial academic citation model...

"Kleinberg...is... the developer of an algorithm known as HITS (Hypertext Induced Topic Search). The intuition behind HITS is very important as it's based on the notion of 'hubs and authorities',"

Main points:

Quote"

o Authority comes from in-edges (pages which point to yours)

o Being a good hub comes from out-edges (pages which you point to)

This creates a mutually reinforcing relationship:

o A good authority is a page that is pointed to by many Good hubs.

o A good hub is a page that points to many good authorities.

However, it's vital to remember that this process is a way of not just identifying linkage patterns, but also identifying web communities and the major players within them...

..with search engines, some links are certainly more equal than others: and some are infinitely more equal.

This is why we talk about "link quality" and not just quantity. A single quality link can frequently have 50 times more power than 100 random or "less qualified" links.

Both Google and Teoma are prime examples of search engines which base their ranking algorithm around the nature and the characteristics of linkage data."
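
For the curious, the mutually reinforcing calculation Kleinberg describes is only a few lines of code. Here is a minimal sketch of the HITS iteration in Python; the tiny link graph is invented for illustration, and this is obviously not Teoma's actual implementation:

    def hits(links, iterations=50):
        """links: dict mapping page -> list of pages it links to."""
        pages = set(links) | {p for targets in links.values() for p in targets}
        hub = {p: 1.0 for p in pages}
        auth = {p: 1.0 for p in pages}
        for _ in range(iterations):
            # A good authority is a page pointed to by many good hubs.
            auth = {p: 0.0 for p in pages}
            for source, targets in links.items():
                for target in targets:
                    auth[target] += hub[source]
            # A good hub is a page that points to many good authorities.
            hub = {p: sum(auth[t] for t in links.get(p, [])) for p in pages}
            # Normalise so the scores converge instead of growing without bound.
            a_norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
            h_norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
            auth = {p: v / a_norm for p, v in auth.items()}
            hub = {p: v / h_norm for p, v in hub.items()}
        return hub, auth

    # An invented five-page "community": two hub-like pages point at three hotel pages.
    links = {"directory": ["hotelA", "hotelB", "hotelC"],
             "blog": ["hotelA", "hotelB"],
             "hotelA": ["hotelB"]}
    hub, auth = hits(links)
    print(sorted(auth.items(), key=lambda item: -item[1]))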

Teoma's algorithm is based on HITS but develops it further by amalgamating with another variation on the algorithm called CLEVER, which was an IBM project.

Quote on Ask Jeeves & Teoma:"The combination of algorithmic search and other data we have to identify structures is incredible. What we do ensure with paid inclusion though, is that it has no impact whatsoever on relevance i.e. paid inclusion is guaranteed entry to the index, but no priority or preference is shown. We maintain absolute integrity within the ranked results. We've come so far with all of this research into structures and hubs and authorities in order to be able to determine exactly what are the authoritative sites. So we're all about absolute relevance. If you see the shift in Jeeves - Jeeves has come along way in terms of relevance....If it's just about being found, then you can try submitting if it's free anywhere, but if you're linked well - we'll find you eventually anyway..."

On Flash: Quote: Flash, through the methods we use, we'd be able to find that page more easily. (Comment: By analysing link patterns of hubs and authorities) We would have a better picture of what that page is and what it's about than most. But more to the point, we'd understand why people were looking for it. Even though it's a Flash page. Whereas, if our crawler looked at that Flash page - it just sees nothing... maybe just a couple of words that really aren't relevant...

Paul: They're learning all the time. That's why no one can know what they really are. These algorithms are continually being tweaked and tuned...
o Mike: Is it too far fetched to start and imagine these machines beginning to start and think for themselves?

o Paul: This is real, what's actually happening. It becomes an interesting philosophical discussion...

Paul: How intelligent are these machines? Think about your brain and think about how things work, like how do we make decisions? So how could a computer make decisions? How different is that? In the same way, I, guess, that the brain uses facts, figures, intuition.

What the algorithms are doing is gathering this type of information. And in the case of Teoma, it turns out, because we're going down to the level and depth of information we are doing, it happens to be extremely valuable information. This is very targeted stuff and these machines are very smart and they can find this very valuable information and then process it at these almost unimaginable speeds. No one could have imagined this... It is artificial intelligence. Our job is to put all of that information, to input it to the engine so that it gets smarter and smarter and has more and more to think about and things come together in a better completeness.


On Spam
Quote "Spam is a significant issue on the web because it affects our user experience and that's what we are concerned about. If I take another perspective on it, the truth is, Spammers spend more time at looking at these methods, when if fact, if they spent more time creating great content, they'd score anyway - without fear of retribution. You know, we have ways of dealing with Spam. We don't talk about them, naturally. We're successful where we have to be. We always see new techniques being used and we watch it, and if we don't like it - why would anyone else?"

On "Good SEO"... NB this metaphor summarises the thinking behind authority sites and hubs...

Quote " I'd say the same thing as I'd say to some guy who's just arrived in town and said, you know... I need a job. I'd say: "What are you good at?"

And then what kind of advice would you give that person? Well this is exactly the way the web works. I'd say knock on a few doors. Go and meet people. Talk to them and tell them what you're good at. And the ones who will like you for that will put you in their address book or their filing system so they have a record of you. They can then refer to you when they have a question, or refer you to someone who may be looking for whatever it is you do. You become known for something and you become a member of a community. You mix in that community.

If you're good at tennis, you join the tennis club. You become a member of a community and if you're a genuine value provider, of genuine interest they will treat you and accept you as part of the community as well. And it doesn't take as long as some people think it takes. The structures are already there, we're refreshing the pages all the time. If a new page comes up, we may not find it straight away, so you can use paid inclusion so we actually know you're there right at the beginning. And if you've got some links on your page going out, we'll see where they are going and we'll start understanding what community you're in and... well I can't go much further, but that's the key to success. Certainly at our level of understanding. We're mostly concerned with, are you in communities and are they good communities. Are you an authority or not an authority? And by that I don't mean that you have to be the leading authority at something, it's just about being weighted as part of that subject community.

Of course, we don't know what the subject is, we simply can't know exactly what all these subjects are, but when someone types in a word [or words] that brings up that community, as a subject specific community and related to that word [those words] then you're part of that and we know. And you'll be found quicker than you might think. It's not that hard to do. And communities don't have to be that big. If it's a good community on a subject...

On themed web sites, the theory is:
Quote " that if you have a web site which sticks to one theme and one theme only, which is centred around a few keywords then this is the ticket to success. A themed web site wins by pure mass, or dense aggregation, or something...Paul: You mean creating page after page on the same subject? Again, they're focusing on the wrong thing...

o Mike: Let me jump in again and put it this way: "Does the guy who has a blue widget web site with 100 pages beat the guy who has only one page - but one very IMPORTANT page?

o Paul: No the larger site does not do better: Because we don't count the number of pages. We care about this: Are other pages on the same subject considering this to be a GOOD PAGE. And you know, even Google and what they do and the other methods, they can't do this. Sure, they do look at who's referring to the page but they don't look at the subject - the subject of the page. Yes, we look at all the information that the others do as well as everything else...If you want to be prominent then simply become known on your subject. Become good at what you do, become valuable to somebody else online for something. Go ahead and optimise your page - but don't make stupid mistakes... If you're selling, I don't know, window dressings, just make sure you've got a term on there that says "window dressings". You know, there are many people who make that kind of stupid mistake by not having the actual text on the page. And we are matching text at some level...the main point: Become a member of your community. It's not so hard. If you're about something commercial there are many places you can go to get noticed. And then, of course, someone linking to you, well that's a good reference....

Again, it's essential to come out of the realm of thinking about this only as ones and zeros, as if it's only complex mathematical equations and very complex architectures, and just think about it this way: how do I relate to other people and organisations?...



Paul: Philosophically our approach is the same - or similar should I say. (Comment: as HITS and Clever) But the methodology is not the same. In fact completely different.
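
Comment: For readers unfamiliar with the linkage analysis Paul is alluding to, here is a minimal sketch of the classic HITS hub/authority iteration (Kleinberg's algorithm, which Clever built on). This is illustrative only - Paul explicitly says Teoma's actual methodology is different and he does not describe it - and the toy link graph is invented.

```python
# Minimal sketch of the HITS idea (hubs and authorities) on a toy link graph.
# Illustrative only; not Teoma's actual, undisclosed method.
def hits(links, iterations=20):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A page's authority score sums the hub scores of pages linking to it.
        auth = {p: sum(hub[q] for q in pages if p in links.get(q, [])) for p in pages}
        # A page's hub score sums the authority scores of pages it links to.
        hub = {p: sum(auth[q] for q in links.get(p, [])) for p in pages}
        # Normalise so scores stay comparable between iterations.
        a_norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        h_norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        auth = {p: v / a_norm for p, v in auth.items()}
        hub = {p: v / h_norm for p, v in hub.items()}
    return hub, auth

# Toy "community": two directory-style pages pointing at three subject pages.
toy_links = {
    "hub1.example": ["siteA.example", "siteB.example"],
    "hub2.example": ["siteA.example", "siteC.example"],
    "siteA.example": [],
    "siteB.example": [],
    "siteC.example": [],
}
hub, auth = hits(toy_links)
print(max(auth, key=auth.get))  # siteA.example emerges as the strongest authority
```

The point of the sketch is simply that "authority" here is relative to a linked community of pages on a subject, not an absolute label - which is the sense in which Paul uses the word above.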

BECOME A "PARTNER FOR PRIVILEGED INFORMATION"... QUOTE:" Paul: We're already developing relationships with selected partners. We have our partners in the paid inclusion program. And these really are trusted partners. And any partner who violated that trust would get notice immediately. Because we allow them to provide us with information that expands our ability to understand what's there, we have to be certain that it's the right information as it's just slightly below our defences.

o Mike: So you've got two levels here: you've got third party suppliers who work on the pay for inclusion side. Whether that's subscription or an XML trusted feed like at Position Technologies. And then at the next level you've got the guys with the search engine marketing firms, the smaller agencies (and the larger for that matter), can they just apply to become a partner?

o Paul: Absolutely, but I do have to say that we limit the number because we can't manage that many ourselves right now. Personally, I'm always open to new partners coming in. We'll work with them as long as they meet a minimum threshold. If they prove themselves to be good partners and valuable assets to their own customers, which is very important to us, then we can choose to work with them on a more permanent basis."

On Teoma brand: QUOTE:

"Paul: Well sure, it's Teoma versus Google and all the other engines in the market place. Whether it's branded Teoma and delivering results to Ask Jeeves and not being recognised is not that important as such."

Algo and beyond: incorporating search data. Quote: "Paul: Yes, there is a level of understanding which goes beyond the algorithm... At Ask Jeeves, for instance, we layer in what's called 'The Knowledge Base'. When we see an opportunity which we consider to be statistically inside a range where we know that somebody is asking for something where we know we have additional knowledge, then we pull that to the top. Basically we analyse and look at the GUI [graphic user interface] very closely. We have the Direct Hit technology...

Click popularity, as it has been known, is a very important aspect of how to rank pages...

the large players are realising the power behind the algorithm. Algorithms scale more efficiently, more predictably than humans if you think about it. Teoma is a great example. Because we can use the 'hubs' we don't need one hundred editors working for us. We have 50 million editors working for us [big grin]"

Interview ends with :" At this point I want to delve more deeply into algorithmic search. Paul is happy to continue the conversation, but says he'd be much more relaxed about it without the tape running. I switch it off and we talk for another 15 minutes during which Paul is very candid. This further information is reserved for the third edition of Search Engine marketing: The essential best practice guide."

Free document about HITS and linkage based algorithms here:

http://www.e-marketing-news.co.uk/topic_distillation


http://www.teoma.com/

Google

Sunday, October 03, 2004

Yahoo: Mike Grehan Interviews Jon Glick, Yahoo!'s Senior Manager for Web Search (eMarketing News)

Mike Grehan's eMarketing News - Internet Marketing Tips:

Jon Glick is Yahoo!'s Senior Manager for Web Search, managing the core relevancy initiatives for Yahoo! Search. Prior to joining Yahoo!, Jon served as the Director of Internet Search at AltaVista and has held positions in new product development, strategic analysis and product management at Raychem Corp., Booz Allen & Hamilton Consulting and the Lincoln Electric Co. Jon has a BS in Computer-Aided Engineering from Cornell University and an MBA from Harvard Business School.

John quote: "to get a good search engine... the best for our end users, everything has to be working well. You know, if you have a great relevancy algorithm and lousy Spam detection you just get a bad experience for instance. You really can't fall down on any of these areas. If you don't have good *de-aliasing tables users get a bad experience. It's all about a lot of things coming together with a very good team. And I think that's what the Yahoo! search team has done very, very well.

[Note: Jon uses the term de-aliasing in reference to knowing that something such as www.coke.com and www.coca-cola.com are the same content. If Yahoo! were to show both URL's following a search on 'coke' then the user wouldn't be getting the diversity of results which would be optimal. He's also happy to point out that a search for 'coke' at Google is representative of the problem!]...
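
Comment: The note above describes de-aliasing only by its outcome. As a rough illustration of the idea - not Yahoo!'s implementation, which the interview does not describe - one simple way to collapse URL aliases is to group URLs whose fetched content is identical and keep a single canonical URL per group:

```python
# Rough illustration of "de-aliasing": collapsing different URLs that serve
# the same content so only one appears in results. NOT Yahoo!'s actual
# implementation; this sketch just uses a content hash as the grouping key.
import hashlib

def dealias(pages):
    """pages: dict of url -> page content. Returns url -> canonical url."""
    canonical_for_hash = {}
    alias_table = {}
    for url, content in sorted(pages.items()):
        digest = hashlib.sha1(content.encode("utf-8")).hexdigest()
        # First URL seen with this content becomes the canonical one.
        canonical_for_hash.setdefault(digest, url)
        alias_table[url] = canonical_for_hash[digest]
    return alias_table

pages = {
    "http://www.coke.com/": "<html>Coca-Cola home page</html>",
    "http://www.coca-cola.com/": "<html>Coca-Cola home page</html>",
}
print(dealias(pages))
# Both URLs map to the same canonical entry, so a search on 'coke'
# would show only one of them and leave room for more diverse results.
```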

And our first goal, as I've said, is to give our users the best experience: full stop. Without that, nothing else really matters. They're the engine that drives everything. But we do also realise that the people who create pages, the content providers do have a curiosity about what they're doing that's working; what they're doing that isn't working... And this is part of transparency. So, we try and give that kind of fuel bar score in the same way as we'd try and answer questions in a forum. We want people to be able to do the right things. It's something we're considering along with a lot of other things. And if it makes sense, we'll roll it out. The other thing is... well, you mentioned that we'd touch on personalisation. For me it seems as though there have been two phases in search. The first phase was all about what was on the page. The second generation of engines started to look at what it was they could find out about that page by looking at what else there was on the web that gave more information about it. The directory listings, the connectivity, the anchor text etc. And we're still in phase two.

For me, and this is me speaking personally, the next phase will be where you're able to take into account information about the user. And of course local, because local search is a subset of personalisation. For local to really work, you need to know where the person is. So, the issue of: "I'm number one for this keyword"... may not exist at all in a few years. You know, you'll be number one for that keyword depending on who types it in! And from where and on what day... and... It is going to get more complex than something that can simply be summed up in a ranking algorithm, let alone how many checks somebody has on a toolbar.

Comment: The promised clarity and feedback to webmasters etc. has not been much in evidence. The fuel bar score has also been disabled.


Site Match
There are three components to the Site Match program.

1) Site Match " the basic per URL submission. It's a subscription charge plus a cost per click. We do this for a number of reasons. If you take a look at what you would have had to have done to get into all the individual subscription programs, Alta Vista Express Inclusion, Inktomi Site Submit etc. You'd generate a subscription fee of over 150 dollars. But now the base fee, for the first year is 49 dollars and then drops for subsequent URL's. So it's much more economical. Especially for a small site that wants to get across a large network. Also, it means that people who are going into a category where they're going to generate a lot of traffic where there's very high value, they have a chance to do it on an ROI basis which they can measure. So it's a more tuned program that we're offering."


2) Public Site Match "This is where we take high quality feeds from Governmental sites, not for profit organisations, Library of Congress and that type of source. This helps to improve the comprehensiveness of our index and also...

o Mike:

How does this XML feed, or "sheet feeds" as they're known, which is basically meta data, blend with the ranking data from a crawl? I mean the feed is data about data; it's not actually being crawled at all. How do you know which is which, and what about the linkage and connectivity data...

3) Site Match Xchange program

o Jon:

"We still have connectivity values for the sites because there's a lot of information that we take from the free crawl which factors in.
For example, an individual eBay auction may not be linked to. But we know what the connectivity score is for eBay on aggregate. So we can take that into account. And as part of the Site Match program, editors are going through and making sure that there is quality to the content and evaluating the quality of that content. For example, pages which are included in the Site Match Xchange program have to have unique titles and they have to have meta data. Things which are not necessarily requirements for a page to be free crawled out on the web. The standards are actually higher because our goal is simply to add quality content. The intention of the entire Site Match program is to increase both the comprehensiveness and also the relevancy of results to our users. We run our own tests to monitor user behaviour. What links users click on; do they click higher on the page... when are we giving users a better experience...
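
Comment: The eBay example above is the key idea: a feed-submitted page may have no in-links of its own, so the engine can lean on the aggregate connectivity of its host. A rough sketch of that fallback is below; the scoring and the 0.1 discount are invented for illustration and are not Yahoo!'s actual formula.

```python
# Sketch of the idea Jon describes: a feed-submitted page (e.g. an individual
# eBay auction) may have no in-links of its own, so fall back on the aggregate
# connectivity of its host. Numbers and weighting are invented.
from urllib.parse import urlparse

def connectivity_score(url, page_inlinks, host_inlinks):
    host = urlparse(url).netloc
    page_links = page_inlinks.get(url, 0)
    if page_links:
        return page_links
    # No page-level in-links: use a discounted host-level signal instead.
    return 0.1 * host_inlinks.get(host, 0)

page_inlinks = {"http://ebay.com/auction/12345": 0}
host_inlinks = {"ebay.com": 5_000_000}
print(connectivity_score("http://ebay.com/auction/12345", page_inlinks, host_inlinks))
```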

o Mike:

Just before I forget to mention it Jon: What about the Yahoo! directory and the 299 dollars for inclusion?

o Jon:

That does still exist. The Yahoo! directory is there for the different ways that people decide to look for information on the web. Some people like to parse a hierarchy, some people want to find other sites that are related within a certain category. And other people take the more direct route of: "I know what I want, I know the keywords..." and they just go directly to the search.

If by benefit you mean ranking - no there's not. It's an inclusion program. It is just about inclusion. It gives us an opportunity to use resources to go through and give them an editorial review of their site and puts them on a one-to-one relationship with the folks at Yahoo! And if you go to Site Match Xchange then you get some good customer service support. It's not going to do anything to influence their ranking. But let's take an example of say, a travel company. The Yahoo! Slurp crawler typically is going to come around and visit a site every three to four weeks. If you're a travel company... two weeks ago you wanted to sell Mardi Gras Getaways. But that's finished and nobody's buying those breaks now. It's Spring breaks for college students maybe. Now if your content changes that dramatically, having us come back and crawl your site every 48 hours may have a significant impact on your business. If you have a page which doesn’t change much, like consumer electronics... standard web crawl may be fine. There's a guy who came to see me earlier and he's doing an art exhibit and they won't have the pages ready until a few days before they're in each city. So waiting for the free crawl to come around may mean that they're not in when they need to be. It is an additional service and if it makes sense for people then they're welcome to take advantage of it. If they're happy with it and they're positioned well and have the crawl frequency, then use it. People who don't use the program will never be disadvantaged in the rankings as compared to other people who do."

Meta data and Yahoo!

"Yes we do use meta keywords. So let me touch on meta tags real fast
.
We index the meta description tag. It counts similarly to body text. It's also a good fallback for us if there's no text on the page for us to lift an abstract to show to users. It won't always be used because we prefer to have the user's search terms in what we show. So if we find those in the body text we're going to show that, so that people can see a little snippet of what they're going to see when they land on that page. Other meta tags we deal with are things like the noindex, nofollow and nocache - we respect those. For the meta keywords tag... well, originally it was a good idea. To me it's a great idea which unfortunately went wrong because it's so heavily spammed. It's like, the people who knew how to use it also knew how to abuse it! What we use it for right now is... I'd explain it as match and not rank. Let me give a better description of what that really means. Obviously, for a page to show up for a user's query, it has to contain all the terms that the user types, either on the page, through the meta data, or in anchor text in a link. So, if you have a product which is frequently misspelled, or if you're located in one community but do business in several surrounding communities, having the names for those communities or those alternate spellings in your meta keywords tag means that your page is now a candidate to show up in that search. That doesn't say that it'll rank, but at least it's considered. Whereas, if those words never appear then it can't be considered...

o Mike:

How many keywords do you put in a meta keywords tag before you start to flag yourself up as spamming?

o Jon:

Okay here's a couple of parameters. Each keyword is an individual token separated by commas. So that's that. You want to separate these things with commas and not just put one long string of text. The more keywords that are put in and the more they're repeated, the much larger the chance our spam team is going to want to check out that page. It doesn't mean that page is going to get any specific judgement. But it is very much a red flag. For best practice you just need to remember it's for matching - not ranking. Repeating the same word 20 times is only going to raise a red flag... It doesn't increase your likelihood of showing up on any given set of search results. It's just a risk with no benefit.
"So I could put, I don't know... er... for instance, ‘laptop computers, desktop computers, palm computers...’

o Jon:

Exactly, and, of course, since each of those is separated by commas, then ‘laptop computers’ will count for ‘laptop computers’ and not ‘laptop’ or ‘computers’ separately. So doing it like that means that you're not going to be penalised for keyword spamming on the word ‘computers’.
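
Comment: To make Jon's "match, not rank" distinction and the comma-separation point concrete, here is a rough sketch of my own (not Yahoo!'s code): comma-separated meta keywords are treated as whole phrases, and they only make a page a candidate for a query without contributing anything to its ranking.

```python
# Illustration of "match, not rank": comma-separated meta keywords are whole
# tokens ('laptop computers' matches only 'laptop computers', not 'laptop' or
# 'computers'), and they only make a page a *candidate* for a query -- they
# contribute nothing to how it ranks. Invented sketch, not Yahoo!'s code.
def meta_keyword_tokens(meta_keywords):
    return {token.strip().lower() for token in meta_keywords.split(",") if token.strip()}

def is_candidate(query, body_text, meta_keywords):
    query = query.lower()
    # A page qualifies if the query appears in the body text...
    if query in body_text.lower():
        return True
    # ...or matches one of the comma-separated meta keyword phrases exactly.
    return query in meta_keyword_tokens(meta_keywords)

meta = "laptop computers, desktop computers, palm computers"
body = "We sell portable machines for students and travellers."
print(is_candidate("laptop computers", body, meta))  # True: candidate via meta keywords
print(is_candidate("computers", body, meta))         # False: 'computers' alone doesn't match
```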

o Mike:

Okay, let's take the description tag now. That gives us a little bit of editorial control still?

o Jon:

The description tag does give you just a little bit of editorial control, depending on what your body text looks like. Ideally we like to find the keywords the user typed in your body text. But this can be a very good fallback for search engines in the event that you have something like, for example, an all Flash page which can't be well indexed, in terms of text by search engines... "
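
Comment: A rough sketch of the fallback Jon describes, again my own illustration rather than Yahoo!'s code: prefer a body-text snippet containing the user's terms; if none can be found, fall back to the meta description.

```python
# Sketch of the snippet fallback Jon describes: show body text containing the
# user's search terms when possible, otherwise fall back to the meta
# description (useful for pages with little indexable text, e.g. all-Flash
# pages). Illustrative only; window size and logic are invented.
def choose_abstract(query, body_text, meta_description, window=80):
    position = body_text.lower().find(query.lower())
    if position != -1:
        # Lift a snippet around the matched query terms from the body text.
        start = max(0, position - window // 2)
        return body_text[start:start + window]
    # No match in the body (or no body text at all): use the meta description.
    return meta_description

body = ""  # e.g. an all-Flash page with no indexable body text
meta_desc = "Hand-made Sicilian ceramics, shipped worldwide."
print(choose_abstract("sicilian ceramics", body, meta_desc))
```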


Spam

Quote: "Alright then Jon: It's been mentioned again. The dark side that is. Let's talk Spam! Of course it's a huge problem with search engines. People who are creating web pages in the industry worry so much about what they're doing with the pages and how they're linking and submitting... and will I get banned... I get asked a lot of questions like: "If I link to my other web site will they know it's mine and ban me?" Or: "My hotel is in New York, New York, will I get banned for keyword stuffing?" Crazy worries. I guess for most of the smaller businesses which aren't up to speed with search engine optimisation, they hear a lot of propaganda which worries them. But at the other end of the scale, I tend to hear more from you guys at the search engines about the activities of less ethical affiliate marketers out there. Now those guys certainly live by their own rules. How do you deal with it?

o Jon:

Well let me just say first that, in that sense, Spam has gotten a lot better over the years. You don't really have as many people trying to appear for off-topic terms as you used to.

How Yahoo! deals with affiliate sites and duplicate content:
Quote: "You now have people who are trying to be very relevant. They're trying to offer a service, but the issue with affiliate Spam is that they're trying to offer the same service as three hundred other people. And the way we look at that is... we look at that the same as we look at duplicate content. If someone searches for a book and there are affiliates in there, we're giving the user ten opportunities to see the same information, to buy the same product, from the same store, at the same price. If that happens, we haven't given our user a good service or a good experience. We've given them one result. So we are looking at how we can filter a lot of this stuff out....

There are a lot of free sign up affiliate programs. They've pretty much mushroomed over the past few years. The plus side is, they're on topic. They're not showing up where they shouldn't... it's the other way... they're showing up too much where they should [laughs] We look at it like this: what does a site bring to the table? Is there some unique information here? Or is the sole purpose of that site to transact on another site, so that someone can get a commission... if that's the case, we'd rather put them directly in the store ourselves, than send them to someone else who's simply telling them how to get to the store.
o Mike:

You guys must get Spam reports the same as all the other engines. So when somebody does a search on a particular product and it turns up that there are ten affiliates in there, whether they're Spamming or not, it's likely that the affiliates could be turning up before the merchant ever does. If you get a high level of that occurring, do you ever go back to the merchant with some feedback? You know, say like, guys, do you want to optimise your web site or just do something about your own ranking?
o Jon:

We do actually talk to a lot of companies. We obviously have a relationship with many of them through the various Yahoo! properties. Different companies often take a different tack. For instance, a company which has been very, very good about listening to us is eBay. I have to say eBay is a company which has been very good at working with us and listening to us on the affiliate issue. Their feeling is really twofold: one is, the people that are confusing the results in the search engines are the same people who are doing things that they don't like on eBay. So for them, they tend to see bad actors in one space and bad actors in another. The other thing, of course, is if you have someone who is using a cloaked page - so, to a search engine it's a huge bundle of keywords and massive interlinking of domains on different IPs, and for a user coming in with IE 5 it's an automatic redirect to pages on eBay... they know that the user doesn't think: "Oh, it's an affiliate Spammer." The perception for the user is simply this: eBay tricked me! There's a link that I clicked that said "get something free", I clicked it and ended up on eBay. And they wonder why eBay would do that to them. And they know that those things hurt their brand. So that's why they have been very proactive in working with us to ensure that those kinds of affiliates are not part of their program.

But... some other merchants may look at it and say: since we're paying on a CPA (cost per acquisition) basis we're actually indifferent as to how that traffic comes to us. They may say, it's like, we don't want to monitor our affiliates, or we can't monitor our affiliates... whatever, we'll take the traffic because there's no downside. It's a different way that they may look at it. And you know, it depends what position they're in, and more, how much they care about their brand, or don't care...

o Mike:

And a similar kind of thing happens on the paid side. I don't want to get too much into that because this is the organic side and I don't want you to get too embroiled in that as I don't know if you're much connected with it. But in PPC with a campaign you can only bid once on the same keyword. It's not possible for you to fix it so that you can turn up at one, two and three on the paid search side. So, what tends to happen there is that, the merchants don't mind if the affiliates are bidding on the same keywords. So one way or another, it's likely that, if they can't hold all the positions down the right hand side, the affiliates will help them. And at least that way they get the sale anyway.

o Jon:

The downside of that for some of them... I actually covered this in a session yesterday. They’re competing with their affiliates, who are actually bidding up to what their zero margin is on their CPA against the cost of those bidded clicks, because their landing pages were just like... you know, one page with a link on it that said: "Click here to shop at Nordstrom." And their marketing spend was actually going up. They were paying people to get traffic that they were likely to have gotten anyway. And they need to roll that back. It may make some kind of sense for a product, but it often doesn't make sense for a brand. It's like, people are probably going to find their own way to your brand name on their own, without the affiliate inserting themselves in the value chain - in that case, unnecessarily. And I think people are getting a little more savvy about their affiliate programs. Now they're thinking more about here's what you can do - here's what you can't do. Now they're thinking a bit more about the ways that affiliates can give them distribution. Here are ways that can optimise sales or hurt the brand. They know that people don't view them as affiliates, they view them as their representatives. If you make lousy pages for people it reflects badly on the brand.

o Mike:

So, to finish off the affiliate and Spamming fear factor... because your lunch is getting cold... if for no other reason [laughs]. What is it that gets you banned - if at all? Is it cloaking, mini networks...

o Jon:

Mike, there isn't an exhaustive list. There are new technologies coming out all of the time. At the highest, or fundamental, level, someone who is doing something with the intent of distorting search results to users... that's pretty much the overarching view of what would be considered a violation of our content policies. In terms of specifics... um... let's do some notes on cloaking. If you're showing vastly different content to different user agents... that's basically cloaking. Two different pages - one for IE and one for Netscape with the formatting difference between those, or having different presentation formats for people coming in on a mobile device perhaps, or just a different type of GUI - that's acceptable. That's helpful.
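
Comment: Jon frames cloaking as showing vastly different content to different user agents, with pure formatting differences being acceptable. As a rough illustration of how such divergence could be spotted - not Yahoo!'s actual detection method, which he doesn't describe - one could compare the words served to a crawler with the words served to a browser:

```python
# Sketch of spotting "vastly different content to different user agents":
# compare the page served to a crawler with the page served to a browser and
# flag large divergence. The word-overlap measure and the 0.5 threshold are
# invented for illustration only.
def word_overlap(text_a, text_b):
    words_a, words_b = set(text_a.lower().split()), set(text_b.lower().split())
    if not words_a and not words_b:
        return 1.0
    return len(words_a & words_b) / len(words_a | words_b)

def looks_like_cloaking(content_for_crawler, content_for_browser, threshold=0.5):
    # Pure formatting differences keep most of the words, so overlap stays high;
    # a keyword-stuffed page shown only to crawlers does not.
    return word_overlap(content_for_crawler, content_for_browser) < threshold

crawler_version = "cheap flights cheap hotels cheap flights best deals cheap flights"
browser_version = "Welcome! Click here for something free."
print(looks_like_cloaking(crawler_version, browser_version))  # True: flag for review
```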

o Mike:

What about a Flash site with cloaked text pages just describing the content - but a true description of the content?

o Jon:

Exactly. For a Flash site which has good text embedded in it. And the cloaked page simply says the non cloaked page has the following text in it... no problem with that. That being said, if someone cloaks the content, that will raise the red flag. The Spam teams are going to look at it. And if what they see is a legitimate representation of the content that's fine. If what they see does NOT represent the content, I mean something entirely different to what the users would get.. they're going to look at that and probably introduce the penalty.
o Mike:

Linkage data...
obviously people are going to do this... they know that links count with search engines, maybe not exactly why though... so the quest begins to get links... any links. Some will buy a thousand fake domains and have them all interlinked and pointing back to the main site...

o Jon:

Yeah. Massively interlinked domains will most definitely get you banned. Again, it's spotted as an attempt to distort the results of the search engine. The general rule is that we're looking at popularity on the web via in-links. The links are viewed as votes for other pages. And part of voting is that you can't vote for yourself. And people who buy multiple domains and interlink them for the purpose of falsely increasing popularity are doing just that - voting for themselves. And the same applies to people who join reciprocal link programs. Unfortunately there are many people who join these because they're fairly new to search engine marketing and maybe someone tells them that this is a great way to do things. That's very dangerous. People linking to you for financial or mutual-gain reasons, as opposed to those linking to your site because it's a great site - a site they would go to themselves and would want their visitors to see - are doing it the wrong way. Let's just take the travel space again. Someone who has 30 pages of links buried behind the home page, literally each with several hundred links, with everything from... golf carts, to roofing, to... who knows. You know that's kind of like: hey, if you like our travel to Jamaica site, you may also be interested in our roofing site... [Mike and Jon burst out laughing here]

o Mike:

It's a shame really. People seem so desperate for links but frequently just have no idea where they're going to get them from. It's my mantra over and over again, and I know you've heard me saying it many times at the conferences: the importance is in the quality of the links you have - not the quantity. And of course, everyone wants to do incoming links. They don't want to do reciprocal linking. They even worry too much about whether they should link out themselves. Getting links in is a lovely blessing, but should people worry too much about linking out?

o Jon:

The thing to remember here, Mike, is who you're linking out to. If you hang out in bad neighbourhoods, as we say, then you will get more scrutiny; that's inevitable. If you end up linking to a lot of people who are bad actors and maybe have their site banned, then you linking to them means you're more likely to be scrutinised to see if you're part of that chain. The other thing, of course, is, when you take a look at connectivity, every site has a certain amount of weight that it gets when it's voting on the web, and that is based on the in-links. And it gets to distribute that... energy... via its out-links. And by that, I mean outside the domain.

Navigational links and other links within a domain don't help connectivity, they help crawlers find their way through the site. I'm just talking here about the true out links. Those outside of the domain. For those... how much each link counts is divided by the number that exists. So if you have a couple of partners, or suppliers you're working with and have an affinity with, if you link out to them - then that helps a lot. If you have... 3,4,5 of them... well if you added 300 random reciprocal links, then you've just diluted the value of the links that you gave to the other people you have the real relationship with. It's as simple as this, people who have massive link farms aren't really giving much of a vote to anyone because they're diluting their own voting capability across so many other people. So you need to consider the number of out links you have on a page, because each additional link makes them all count for less.
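
Comment: Jon's dilution point is simple arithmetic: whatever voting weight a page has is split across its external out-links, so each added link makes every link from that page count for less. A toy sketch is below; the equal proportional split is my simplification for illustration, not Yahoo!'s actual formula.

```python
# Toy illustration of link dilution: a page's voting weight is split across
# its external out-links, so every extra link makes each one count for less.
# Equal split is a simplification, not Yahoo!'s actual formula.
def weight_per_outlink(page_weight, external_outlinks):
    return page_weight / external_outlinks if external_outlinks else 0.0

page_weight = 1.0
print(weight_per_outlink(page_weight, 5))    # 0.2 passed to each of 5 real partners
print(weight_per_outlink(page_weight, 305))  # ~0.0033 each after adding 300 reciprocal links
```
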
o Mike:

Jon... I feel as though I've virtually exhausted you. This has been so useful and I really do appreciate the time you've given, not just to discuss your own Yahoo! properties but for giving such a wonderful insight into search engine marketing best practice. I honestly believe your contribution here will help the entire readership, at whatever level they're at in the field, to have a more comprehensive knowledge. Thank you so much.

Jon:

No problem Mike. Anytime at all. It's always good to talk with you.


Intermission Time - Go get a drink, take a break, but come back! There's more good stuff to come!

Study Shows How Searchers Use The Engines
by Christine Churchill

Usability has always been one of my favorite subjects, so when Enquiro published a new study showing how users interact with search engines, it was a must-read. The study turned out to be such a fascinating report, I had to share it.

Gord Hotchkiss, President of Enquiro, and his team of able research assistants ran 24 demographically diverse participants through a series of tests to observe and record their behavior as they interacted with search engines. While everyone will agree that 24 is not a statistically significant sample size, I think the results of the project show interesting findings that are worth considering.

As I read the study, a number of his findings on user behavior correlated with other studies I've read. For example, Gord mentions that almost 60% of his users started with one search engine (usually Google) and then would switch to a different engine if the results weren't satisfying. This finding is consistent with data from ComScore Media Metrix that talks about user fickleness toward search engines. CNET writer Stefanie Olsen did a great job summarizing that data in her article on the search wars. The message to the search engines is "Stay on your toes, guys, and show us relevant results or we're out of here."

The Enquiro team found that there was no consistent search method. Everyone in the study did it a little differently. People doing research used engines differently than people on a buying mission. Women searchers differed from men in their searching techniques. Gord tells us "an organic listing in the number 8 position on Google might not have been seen by almost half the men in the group, but would have been seen by the majority of the women." Let's hear it for women's powers of observation!

One finding of the study that is near and dear to every search engine marketer's heart is, "If no relevant results were found on the first results page, only 5 participants (20.8%) went to the second page."

This is consistent with numerous studies documenting that users don't go very far in the results pages for answers. Probably the most famous research to document this behavior was the study by Amanda Spink and Bernard Jansen where they found 58% of users did not access any results past the first page. I had the pleasure of talking with Amanda a few years ago when I was first moving to Dallas and she was moving out of it. She's a fun lady with a flair for writing provocative titles to research papers on search engines. Expect to hear more from her in the future.

A finding that warmed my longtime SEO innards was that there was a "sweet spot" for being found on a search engine's results page, and that place was the "above the fold organic results," that is to say, the portion of the free listings that can be viewed without scrolling. Considering how cluttered some search engine results pages are getting, this is good news! According to Gord, "All 24 participants checked these 2 or 3 top organic rankings."

I suppose it shouldn't be too surprising to find the "prime real estate" in the middle section of the page; this is consistent with eye-tracking studies that show the center column to be the first place users look on a web page. Of course, one might wonder why users tended to skip over the category and product search lists. Gord's team asked users why none of them bothered to look at the news and shopping feeds that appear at the top of the organic results. Users said they didn't know what they were.

I had a déjà vu moment when I read that because this is almost identical to a comment that was made to me by a usability tester in an in-house usability test. My tester said they skipped over the product search section because they were unfamiliar with it and it "looked confusing". They jumped straight to what they recognized as "safe" - that being the organic list of results.

Another finding I found myself agreeing emphatically with was that top sponsored positions had "a 40% advantage in click throughs over sponsored links on the right side of the screen". It makes sense when you think about it - the spot is so in your face - users can't miss it. The fact that this spot produced a great click through was a well known PPC insider secret and many of us who do PPC management had devised elaborate methods to get our clients in those top spots. We've been hearing evil rumors that Google may be phasing this spot out in the future. It was still there today when I checked, so maybe Google is planning on keeping it awhile.

A finding that could be affected by Google's recent ad overhaul was that users of Google were more likely to resist looking at sponsored ads than users of other engines. Part of the explanation is that Google's ads looked more like ads than those on other sites - hey, they were in little colored boxes off to the right that practically screamed "Ad!" You couldn't possibly mistake them for content or organic results. Since Google has dropped the little colored boxes and gone with plain text for the ads, one can't help but wonder if users will be less resistant to ads now.

The Enquiro study includes a summary section toward the end of the report. Here they identified items that captured the searchers' attention enough to make them click and listed important items to include on a landing page. I won't give away the store by telling you everything, but I will tell you, as you may expect, the title and description shown on the results page were the most important eye magnets for attracting users' attention.

Perhaps the most intriguing of the report findings was that search is a circular and complex process, not a linear process as we sometimes like to simplify it into. Instead, search is a multi-step process with multiple interactions with sites and search engine results pages. Gord's team found that "a typical online research interaction can involve 5 to 6 different queries and interactions with 15 to 20 different sites." That's a lot of sites and a lot of back and forth between sites and search engines.

The takeaway point from this study is that search is definitely more complicated than it looks at first glance. I guess that's what makes search marketing so absorbing. For everything you learn about it, there are ten more questions yet unanswered. Sounds like we need a sequel to this report - eh, Gord?

Check out the study yourself by downloading it off the Enquiro web site. It's a fascinating report and it's only 30 pages including lots of pictures. Happy reading!

Google

Friday, October 01, 2004

Yahoo! Review on Search Engine Showdown

Yahoo! Review on Search Engine Showdown: "Review of Yahoo! Search
Last updated Sep. 25, 2004.
by Greg R. Notess."

Google

Review of A9: Is It Really A9 Out Of 10? SiteProNews:

SiteProNews: Is It Really A9 Out Of 10?: "results on A9 have been 'enhanced by Google' but after some extensive search testing it is obvious that A9 is merely a technology-laden jacket for Google results."

Google

Fallout from Yahoo! Search : WSG Newsletter

Overview of Yahoo! history since the switch to its own technology, from a newsletter by Gwen Harris. WSG Newsletter: Fallout from Yahoo! Search: "Today versions of Yahoo! Search are used at AltaVista, Alltheweb, and, through the former Inktomi customers, MSN, Hotbot, Hotbot.UK, Lycos. It's a web-search monoculture."

Google

Overview main search stories Internet News

By gwen.harris@sympatico.ca, Internet News: "Changes at the major search engines and subject directories.

The main competitors continue to be Google, Yahoo and MSN with Ask Jeeves and Amazon making strong showings with new and interesting features. There are several others in the wings.

- Google has been hiring staff and there are rumours that it will be releasing its own browser and building a new database."

I asked: Do the site log files show anything similar to the forum posts reporting increased Google bot activity and/or a new Google bot?

Various SEO forums are exhibiting a flurry of posts similar to the old Google Update Syndrome.

Worryingly, the favourite speculation at the moment is that Google is gunning for big sites with thousands of pages of similar content, and for those using cloaking. As ever, speculation is rife and evidence thin, so any current info from totaltravel logs may prove crucial if Google SERPs and traffic change.

Example forums:

http://www.webmasterworld.com/forum3/25897-15-10.htm

http://www.seo-guy.com/forum/showthread.php?t=3179&page=2&pp=10

"- Yahoo has made its new search engine more stable but also seems to be sidelining the directory. Most effort was probably going into the revisions at MyYahoo. There are also some new shortcuts, mainly of interest to people in the US."
I also note that Yahoo has a beta version of HP now online...

"- Ask Jeeves has almost doubled its database and has been adding personal research features. The new MyJeeves makes saving results easy. Take the tour at ask.com.

- Amazon's A9 has added content and more personalization for saving searches and results."

Google
Creative Commons Licence
This work is licensed under a Creative Commons License.