Friday, May 28, 2004

What Lies Ahead for Local Search Engine Technology?

What Lies Ahead for Local Search Engine Technology?: "What are some of the challenges search companies face with local search?

[AF] Search engines are developing ways to disambiguate and adequately address location-specific queries. Geo-targeting Web search content, both organic and paid, requires search engines to better understand users and queries, inferring local intent by extracting geo-signals and leveraging implicit and explicit user profiles. Taking local search marketing services to market is also very different than selling paid listings to online businesses. The vast majority of local businesses still don't have a Web site, nor the time and expertise to invest in managing sophisticated auction-type listing campaigns."

On the PPC side, Overture and Google go one step further, suggesting forecast traffic levels and cost estimates for specific keyword combinations, match types, and bid amounts. In a yield-driven context, where content targeting gets more sophisticated and matching more scientific, Paid Inclusion and Paid Listing programs will eventually merge into more automated bid-for-traffic models. Ultimately, advertisers will target impressions by dictating an ROI level acceptable to them, such as "8% over advertising spend". To meet these requirements, search engine marketers will increasingly rely on automation tools to target the right content to the right users at the right location at the right time.
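As an illustration of the bid-for-traffic idea, an automation tool can work backward from a target return such as "8% over advertising spend" to a maximum cost-per-click. This is a hypothetical sketch; the conversion rate and profit figures are invented, not from the interview.

```python
# Hypothetical sketch: derive the highest CPC that still meets a target
# ROI of "X% over advertising spend".

def max_cpc(conversion_rate, profit_per_order, target_roi):
    """Highest cost-per-click meeting the target ROI.

    Expected profit per click = conversion_rate * profit_per_order.
    Requiring that profit exceed spend by target_roi gives:
        profit_per_click >= cpc * (1 + target_roi)
    """
    return conversion_rate * profit_per_order / (1 + target_roi)

# e.g. a 2% conversion rate, $50 profit per order, 8% required return
bid_cap = max_cpc(0.02, 50.0, 0.08)
print(round(bid_cap, 2))  # roughly $0.93 per click
```

An automated campaign tool would recompute this cap as conversion data comes in, rather than holding bids fixed.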

Trends: One of the most significant developments currently underway in web search is the integration of search capabilities within a broad range of other services.

Let's look at commercial searches and informational searches; do you see the two becoming distinct categories?

[AF] No. A central theme behind classical information retrieval theories is that users are driven by an information need. More granular search log analyses over the past years have attempted to categorize queries as "transactional" (commercial), "informational", and "navigational". The immediate intent behind "navigational" queries is to reach a particular site; "informational" queries aim at acquiring information assumed to be present on web pages; while "transactional" queries usually result in some activity such as an online purchase. Andrei Broder, while chief scientist at AltaVista in the late '90s, demonstrated that queries at the time were roughly split equally among the three categories.

Does search have a future on a cell phone?

[AF] Sending local content such as yellow page listings, directions, maps and business ratings to mobile devices just makes sense.

Google

Search weariness may finally be setting in & Users & paid content

LLRX -- Coming Soon -- the Death of Search Engines?
Second Quarter Scores: Covers consumer satisfaction with e-business, portals & search engines.

Users & paid content: As they have come to expect the Internet to provide instant delivery of just-in-time information on a 24/7 basis, users now expect information to always be there. They leave even less time than they used to for finding information. And as consumers become ever more accustomed to paying for time-saving convenience to help them deal with other aspects of their lives, it’s reasonable to think that they may pay for the convenience of information that can be delivered instantly, with no muss or fuss.


Thursday, May 27, 2004

Search user research

Search Engine User Attitudes: "Over the past few weeks, search engine marketing firm iProspect has released a series of reports studying search behavior. The latest survey, Search Engine User Attitudes, involved 1,649 people surveyed at the end of March 2004...

Covers attitudes to:
Usage
Loyalty
Tool bars
Search Failure
Relevancy: Paid Versus Free

Enquiro has also been busy. Inside the Searcher's Mind: It's a Jungle in Here!: "Similar to iProspect, it has recently released results from a survey of hundreds of people about how they interact with search, as well as a focus group look"

There was a strong tendency to skip past the sponsored listings and go directly to the organic results. Less than 20 percent of the participants were confused about what was a sponsored link and what was an organic link.

Google users were the least confused about what was sponsored and what wasn't on the results page. The greatest confusion was found amongst MSN users.

the majority of users (19 out of 24, representing almost 80 percent of the group) tend to skip over sponsored results and go first to the top organic results. If the users find something relevant in these results, they may never return to the sponsored listings.

Outlines Four Types Of Searchers

Concludes:
users are much more likely to use a search engine during the research phase of the buying funnel. Usage of search engines drops off as the user draws closer to the actual purchase transaction.

This was echoed in the focus group, where 68 percent of participants indicated they would use a search engine to help research a purchase, but only 41 percent indicated that they would purchase an item online, and only 28 percent indicated they would use a search engine to help them make this purchase.

It's important for marketers to understand where in the buying funnel their customers are most likely to use a search engine to help in their purchase.

If it is primarily in the research phase, then searchers are looking for distinctly different things than they would be if they were using a search engine to make a purchase. The marketer may be trying to capture a click-through by promoting free shipping or discounted prices, while the consumer is looking for information on product features, consumer reviews and competitive comparisons.


Tuesday, May 25, 2004

Yahoo addy for dir corrections: Wrong info. in Y! Directory

Wrong info. in Y! Directory: "OK, this should be the correct info regarding Directory questions and modifications:
For questions about a site in the Directory, you should first go to our help pages at http://help.yahoo.com/help/us/dir/. If you still have questions, please email the Directory Support team at:
url-support@yahoo-inc.com.
This email address is monitored daily. For Express appeals within 30 days of when the site was added or denied, you should respond to the original editor who reviewed the site. The Yahoo! Express Service Agreement states the following:

http://docs.yahoo.com/info/suggest/terms.html

5.2 If Yahoo! accepts a suggested site, Applicant is entitled to request one reconsideration of the placement of the site, comment fields, title, etc., at no additional charge by sending an email to the address provided by Yahoo!. Yahoo! must receive the Applicant's email within thirty days from the date that Yahoo! transmits the acceptance email to the Applicant. If a timely request is received, Yahoo! will again review the Applicant's site and will then notify Applicant of its final decision regarding the site. Once a site is included in the Directory, if Applicant makes any substantial changes to the site, Applicant must comply with Yahoo!'s standard change process form to make any changes to the listing in the Directory. "


A Dropped Site Checklist

A Dropped Site Checklist: "One of the most common themes of posting here in WW starts something like: 'Last night, my site disappeared...'
'Losing' a site can be a painful and frustrating experience. To help ease the pain, perhaps a starting list of potential issues might help. I'll probably miss more than I'm catching with this list, but at least it's a start.
Do a site search at the SE in question to determine if all or some of your pages are gone. Some think that their site has vanished, when in fact an algo update or tweak has occurred, causing their pages to drop. Or, individual pages have been filtered or penalized, but not entire sites:
If *all* of your pages are gone (search on URL's to check that), then perhaps:
• your server was down at an inopportune time.
• you have a robots.txt problem.
• you've been removed from the index based on a perception of bad behavior (not good).
If only some pages are gone, or if your pages have simply dropped badly in the SERP's, then perhaps:
• you have some other technical issue not noted above (e.g., badly executed redirects),
• the algo changed,
• you've done something recently that the SE did not like, or,
• the algo changed and something that was previously 'OK' is now being filtered or penalized.
Here are some specific things to look at:
Start with the basics: Was your server down recently?
Server failure is always a good item to check off your list when searching for problems. No need to start remaking your site if all that happened was a temporary problem.
Are you using a robots.txt file and, if so, has it changed? Is the syntax correct?
There are a variety of potential problems that can be caused by improper code in robots.txt files, or placement of the robots.txt file in the wrong location. Search WW on this topic if you're not sure what you're doing. Use the WW Server Header Checker. At worst, a robots.txt file can tell a SE to go away, and you really don't want that. ;-)
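One concrete way to answer "is the syntax correct?" is to run your rules through a parser before the SE's do. A minimal sketch using Python's standard robots.txt parser; the rules shown are deliberately contrived to highlight how one stray character matters ("Disallow: /" shuts out the whole site, while an empty "Disallow:" allows everything):

```python
# Test robots.txt rules locally instead of waiting for a crawler to
# misread them.
from urllib.robotparser import RobotFileParser

def blocked(rules, url, agent="*"):
    """True if the given rules tell the agent not to fetch the URL."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return not rp.can_fetch(agent, url)

everything_blocked = "User-agent: *\nDisallow: /\n"
nothing_blocked = "User-agent: *\nDisallow:\n"

print(blocked(everything_blocked, "http://www.example.com/index.html"))  # True
print(blocked(nothing_blocked, "http://www.example.com/index.html"))     # False
```

With a live site you would load the real file (rp.set_url(...); rp.read()) instead of parsing a string.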

Have you more aggressively optimized recently?
Internal changes that can lead to potential problems include:
• More aggressive kw optimization, e.g., changes to Titles, META's, tags, placement and density of kw's, etc.
• Link structure changes, and especially link text changes. Updates to link text or structure, if done for optimization reasons, can push a site into filter/penalty territory. Look in particular for overuse of kw's.

Have you added redirects?
The SE's *can* sometimes become confused by redirects. Assuming that the changes are intended to be permanent, use 301's, not 302's. Be especially careful about large-scale changes. If done properly, redirects are important tools. Done without proper knowledge, they can lead to short-term pain, often on the order of 1-6 months.
http://www.webmasterworld.com/forum3/8706.htm
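To confirm which status code a redirect actually returns, request the old URL and read the raw response. A minimal self-contained sketch; it uses a throwaway local server standing in for a real site, since http.client does not follow redirects and so exposes the 301/302 distinction directly:

```python
# Check the raw redirect status a URL returns (301 permanent vs 302
# temporary). A tiny local server plays the part of your site here.
import http.client
import http.server
import threading

class Redirect301(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(301)  # change to 302 to see the difference
        self.send_header("Location", "http://example.com/new-page")
        self.end_headers()
    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Redirect301)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# http.client does not follow redirects, so the raw status is visible
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("HEAD", "/old-page")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))
server.shutdown()
```

Pointed at your own host instead of the local stand-in, the same HEAD request shows whether your redirects really send 301 and not 302.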

Do you have a significant number of interlinking sites?
If ever there was a strategy that might be summed up as: "Here today, gone tomorrow..." interlinking is it. You can succeed with this strategy. But if you add too many sites or links to the mini-net you're creating, or interlink too aggressively, it can catch up to you. Penalties can range from soft filters to complete manual removal in rare cases. Even with no recent changes to your sites, the SE algo's can change, making something that squeaked by yesterday illegal today.
http://www.webmasterworld.com/forum3/4618.htm

Are you linking to sites in "bad" neighborhoods?
If ever there was a strategy that might be summed up as: "Gone today..." linking to "bad" sites is it. If you think that you might be linking to the dark-side, lose that link instantly, if not sooner.
http://www.webmasterworld.com/forum3/8053.htm

Could you be suffering from a duplicate content penalty?
Some practices or occurrences that can cause problems in this regard include:
• Use of a single, site-wide template
• Use of one template across multiple sites
• Competitors stealing or mirroring your content
• Redirects from an old domain to a new one
• Over-reliance on robots.txt files to exclude bots from content areas you don't want exposed. WebmasterWorld Thread:
http://www.webmasterworld.com/forum3/22494.htm

Are you cloaking?
Some cloak merely to deliver "accurate" pictures of sites/pages to the SE's. Examples of this are sites with lots of graphics and little text. But if you're a mainly text based site that is delivering one set of content to the SE's while users are seeing something less...umm...optimized...then there's always the risk that you've been caught.

Are you using AdWords?
This is pure speculation on the part of some seniors here, but some do seem to firmly believe that if you place highly with an Adwords listing, it might actually hurt your position in the SERP's. Don't shoot me. I'm just the messenger.

If, OTOH, the only issue is that you're not as high in the rankings as you'd like, then a better place to start would be Brett's 26 Steps to 15K a Day.


LookSmart's New Keyword Tools for Advertisers: A Closer Look : May 2004

LookSmart has added a suite of new tools to the Advertiser Center: keyword
suggestions, an expected-positions indicator, and bulk uploading of keywords and
listings. Limited time free campaign optimization offer too ($99 value - click
the second link).
LookSmart : A Closer Look : May 2004: "You can access the Keyword Suggestion tool during the Add Listings process. Enter any keyword and you will immediately generate a list of the top 20 related keywords, ranked in order of impression volume. As the graph shows, one-word search queries are on the decline, so pay particular attention to multi-word suggestions... "


Yahoo review of penalized sites

Yahoo review of penalized sites: "Yahoo review of penalized sites
Believed by Webmasters to be clean"

Gives two addresses for appealing:
reportsearchspam@yahoo-inc.com
webmasterworldfeedback@yahoo.com


Reasons for a Yahoo Penalty & generic email

Yahoo... the situation continues... Reasons for a Yahoo Penalty:

Yahoo is currently sending a generic email to those that inquire about a penalty that outlines the following possible reasons:
- Cloaking (showing crawlers deceptive content about a site)
- Massive domain interlinking
- Use of affiliate programs without the addition of substantial unique content
- Use of reciprocal link programs (aka “link farms”)
- Hidden text
- Excessive keyword repetition

...These reasons are generic at best. What I'm curious to determine is where the lines are and what sort of techniques could trip each of these. Of these, the following seem most ambiguous and potentially far-reaching:

1. Massive Domain Interlinking - what does this mean exactly? What if you use various subdomains for different sections of your site? Does this trigger a penalty? Too many links between each of your pages? And, how many is too many?

2. Affiliate programs without substantial unique content - What exactly is substantial? If a car site uses the descriptions of cars from its affiliate program as part of its catalog, but offers its own significant car guide section - is this substantial enough? Or is there some ratio in play here?

3. Reciprocal Link "programs" - this is perhaps the trickiest of them all. Google has long since banned link farms, but could Yahoo be looking beyond simple link farms to see if a site has too many reciprocal links? In a recent quick search of a popular key phrase, I found a surprisingly small number of listings with links pages.

Are certain categories being hit harder than others? Many have pointed out travel, but I've seen other examples as well.
Are the hardest hit categories those that Yahoo eCommerce is competing directly with?


Reasons for a Yahoo Penalty:

"My site is gone from the Yahoo serps for all of my normal search terms.
I think I've received a duplicate penalty because of my Affiliates. I have a Yahoo Store, and each affiliate gets a link like this:
store.yahoo.com/mystore/affiliateID#
Since Yahoo dropped Google and started its own search, these affiliate links have been showing up in the Yahoo serps. Sometimes even higher than my own page. They point to my homepage and would appear to be mirror sites. I've been worried for a while now that the new Yahoo algo was going to see them as duplicate content.
It's amazing that Yahoo SEARCH has done this with their own Yahoo STORES. Yahoo Store's affiliate program creates these URL's automatically to send to our affiliates. Yahoo encourages the store owners to get affiliates, but when we do, they penalize us for duplicate content.
Of course, this is just my guess on why I was dropped from Yahoo, and maybe I'll get lucky and be back in tomorrow. But regardless, Yahoo shouldn't have all those duplicates of my site in their index. Google seems to sort them out just fine. Why can't Yahoo?
I was having the same problem with Inktomi, but even they seem to be sorting them out better than Yahoo."

"There are different reasons some sites and pages have not been appearing in the serps. Sometimes the reason is penalties of one sort or another. A completely unrelated phenomenon of a technical nature has been affecting another group of pages -- usually "pages" and not "sites". Penalties normally apply to whole domains. The technical problem was usually/mostly something that affected individual pages. I certainly don't know ALL the reasons the glitch affected a page, but one is when another page linked to a target page via some sorts of redirects, the target page would disappear from the serps while the "linking to" redirect URL would appear...Notice how this will only affect one page on the target domain. Usually this would be the main page of the target domain, and interior pages would continue to rank more or less normally. Some whole domains might be affected by the glitch, though, if they had some all-encompassing redirect or duplicate issue that caused them to be mistakenly lost. "


An Insider's View of Microsoft's Longhorn Search

V brief, link to vid An Insider's View of Microsoft's Longhorn Search: "'We talk about good search in Longhorn, but search is a multi-level kind of thing.' Telling comment: '...all the way down to basic Google-like full text searching.'"


New Look In July, New Search Engine Later, Says MSN

New Look In July, New Search Engine Later, Says MSN: "MSN announced a redesign for its MSN Search service last week, a cosmetic change that helps the service comply with US Federal Trade Commission recommendations about labeling paid placement results.
The change being implemented on July 1 will not coincide with the launch of MSN's own search technology. A crawler-based MSN search engine has been in development since last year, but there remains no announced launch date for this.

In particular, there will be two Sponsored Sites areas. One is a box at the top of the search page, giving ads "inline" placement. The other ads will run along the right-hand side of the screen Google-style, in what Search Engine Watch calls "sidebar" placement and what MSN refers to as the "right rail." In both cases, a "Sponsored Sites" heading is clearly displayed.

The main listings on MSN Search's results page -- those under the "Web Pages" heading -- currently come from Yahoo. These are a mixture of pages found by Yahoo's crawling of the web and content obtained through Yahoo's content acquisition program, some of which involves paid inclusion.

After the July change, the main listings at MSN will continue to come from Yahoo. They simply will no longer have a "Web Pages" heading above them.

At some point in the future, MSN expects to replace this data with that found by its own crawler. As previously said, this won't happen in July. Instead, it's more likely to happen toward the end of the year.

FTC recommendations: As long as paid inclusion doesn't provide a ranking boost -- which MSN supplier Yahoo says is the case -- paid inclusion need only be disclosed via a search engine's help pages."


Monday, May 24, 2004

Spam guidelines reminder: Article covers "Doorway Pages"

How to Spot Search Engine Spam: Doorway Pages:
"Google:

http://www.google.com/Webmasters/index.html
http://www.google.com/terms_of_service.html

Yahoo:

http://docs.yahoo.com/info/guidelines/spam.html
http://docs.yahoo.com/info/terms/

Teoma/Ask Jeeves:

http://sp.teoma.com/docs/teoma/about/terms_of_service.html
http://ask.ineedhits.com/programterms.asp#spam"


Did Bill Gates shake the blogosphere?

Investor's Business Daily: Breaking News: " Bill Gates told Warren Buffett about blogging on Thursday. Gates' endorsement of blogging, Rubel said, is likely to lead to more businesses using it: 'Bottom-up business communication will only gain steam here.' But there's more to the story. Gates' comments were also 'a veiled declaration of war on Six Apart, Userland, Google and anyone else who makes blogging tools.' Rubel's blog is called MicroPersuasion.
Microsoft has indeed been a booster of blogs. More than 700 employees publish the online diaries, often discussing projects and software in development. One of the more ambitious is Channel9, the work of five company employees who 'want a new level of communication between Microsoft and developers.' It includes descriptions of new technology, video interviews of Microsoft program managers and developers, and some gossip"


Friday, May 21, 2004

Design by Fire: Design Eye for the Usability Guy

Compare the immediate visual impact of the 2 sites....

A team of web designers takes a playful look at Jakob Nielsen's latest Alertbox on styling links, Design Guidelines for Visualizing Links (Jakob Nielsen's Alertbox). Learn how they went from drab to fab; they even provide a PDF. Design by Fire: Design Eye for the Usability Guy:

"However, the language could use some tightening. Nielsen needs to take his own advice and cut his own contribution to information pollution. While he's at it, he should drop all the pseudo-scientific mumbo-jumbo. Talk to me, man. Straight up. Whip that content into shape and make your audience take notice by the sheer, razor-sharp clarity of it all."


'Future of Search Will Make you Dizzy'

'Future of Search Will Make you Dizzy': "'Future of Search Will Make you Dizzy'
By Ryan Naraine

NEW YORK -- Amazon.com's A9 subsidiary wants to play a large part in pushing search technology to a future where the relevancy of search results will be startling and exciting.
That's the word from A9 chief executive Udi Manber, PhD, who insists that full development of search and resource discovery tools remains at least a decade away.

Another hiccup for researchers, Manber said, is that the relevancy of search results is hard to measure. "Relevancy changes all the time and is not well understood. Relevancy is different from user to user. We have to figure out better ways to measure [results] to make it better. That's the hard part. We need a science around measuring relevancy."

"It's not about speed or size anymore. It's all about quality. It's about delivering the tools that allow relevancy. It's good to make searching faster and faster because that part is well understood. The quality part is not understood and that's the challenge we face today," he added.

Manber described "Search Inside the Book" as the most exciting project he had ever worked on and gave an overview of the process of scanning 33 million pages from more than 120,000 books while creating the text-search tool. "There were lots of monitoring tools, lots of testing, lots of databases and lots of support for huge data. The entire thing took about six months to launch and it's something I'm very proud of."

The search technology pioneer also outlined the personalization capabilities of the A9 portal, which is integrated with Google and Amazon.com to allow Web and book searches in real time. It is also programmed to save user searches into personalized profiles."


First Search Engine for the Disabled - YouSearched

First Search Engine for the Disabled - YouSearched: "YouSearched.Com is proud to unveil the world's first search engine designed to be FULLY ACCESSIBLE for people with disabilities. Search is the second most used application on the Internet after Email but people with disabilities have not been able to fully utilize the capabilities of search engines."
See the site for the full list of disability guidelines...


Wednesday, May 19, 2004

Overture gets aggressive in AU Making the right overtures - Next - smh.com.au

Making the right overtures - Next - smh.com.au: "Two of the nation's biggest web publishers - Ninemsn and Fairfax's f2 Network - have signed up with Yahoo! subsidiary, Overture, in an agreement which bumps aside deals the two publishers had with search competitors LookSmart and Google. The move brings to seven the number of partners Overture has snared since launching in this country on Australia Day. "


Interesting additional facility for Blogger... Hello: Introducing BloggerBot: "Introducing BloggerBot
Now you can use Hello to publish pictures to any Blogger blog.
Meet BloggerBot.
Just use Hello to send your pictures to BloggerBot. BloggerBot will automatically resize your JPG pictures, add your captions, and publish your pictures to the Web. "


Yahoo Reawakens The Paid Inclusion Debate: "The difficulty with mixed messages hasn't changed, as you can see using current material from Yahoo:
Yahoo...today announced that it has created a more comprehensive and relevant search experience for users through the deployment of its own algorithmic search technology --Yahoo Press Release, Feb. 18, 2004
Sounds like good news for searchers. But wait! What are advertisers being told about Yahoo-owned Overture Site Match, which feeds paid inclusion content into Yahoo's search results?
Eliminate guesswork: Ensure that your pages are reviewed and included in the search index quickly and refreshed frequently. No waiting for search engines to find your site or guessing which content will be included. --Overture Site Match product page, May 18, 2004
On the one hand, searchers are told that Yahoo has a comprehensive and relevant search engine, which you'd also assume means it's fresh. On the other hand, site owners are told that Yahoo's search engine apparently just guesses about what to include and may not refresh that content frequently.
Is it any wonder that Yahoo's gained bad press after unveiling its new programs? Either you have a great search engine or you don't. Trying to play it both ways simply doesn't fly...

Google certainly had a field day watching Yahoo try to justify its program. Google had just come off one of the worst periods of publicity it had ever known (this being before the Google Gmail announcement in April and major privacy concerns that have followed). Last December's upset over Google ranking changes spawned a number of anti-Google forum discussions and articles.

What on earth could save Google? Yahoo's complicated paid inclusion program came to the rescue. After weeks of having its own results questioned, Google got to sit back and watch Yahoo's program get put under a microscope.

Google also brought out company cofounder Larry Page in unprecedented fashion for major publications. Gone were the typically cautious or limited comments about the competition that Google normally gives. It's a search war now, the gloves are off, and by not having paid inclusion, Google seemingly occupies the high ground. Page confidently shot quotes like these not just across Yahoo's bow but directly amidships:

"Any time you accept money to influence the results, even if it is just for inclusion, it is probably a bad thing." -- New York Times

"It's really tricky when people start putting things in the search results," -- Wall Street Journal

If you need a last sign of Yahoo's failure on the propaganda front, take note that Yahoo Watch has now been founded by Daniel Brandt.

Brandt created the anti-Google site of Google Watch back in 2002, when Google's incredible rise in popularity was starting to raise concerns. Until now, Brandt has been content to hold Google solely responsible for the ills that often are applicable to its competitors as well.

Brandt did always say he might target other services, if he felt they warranted it. Now Yahoo's paid inclusion programs managed to draw his ire. Google is no doubt pleased to find Yahoo is now part of the axis of search evil it formerly occupied alone....

I also expect Yahoo will likely begin fighting back against charges that it is less pure than Google by pointing out that Google's AdSense program gives Google potentially as much incentive to skew results as does Yahoo's paid inclusion program.

AdSense puts Google ads on pages across the web outside of Google. Obviously, if Google drives traffic to pages carrying its ads, the company may earn more money. It's something Google strongly denied it would ever do when AdSense was launched, a denial it repeated again recently when I looked in December at allegations of Google favoring sites with AdSense content. Nevertheless, as with Yahoo, the incentive for favoritism is there.

I'd already planned to cover the AdSense issue as part of this series. However, I was especially surprised to have Yahoo raise the issue itself, when I was discussing this series with them last week. I don't recall Yahoo employing the "Google has incentive to be bad" argument before. That's why I suspect it may emerge as a new line of defense, if paid inclusion criticisms continue.

The series will also explore the future of paid inclusion. Will the other search engines turn toward or away from paid inclusion? Will Yahoo decide it needs to do inline disclosure of paid inclusion URLs? That's something I certainly hope will happen."


Tricky Linkers Category: Link Building by www.e-axis-inc.com

Are you actively building your reciprocal link campaign but not seeing the results show up in the search engines? Here are a few tricks webmasters are using to cheat you out of your reciprocal link count.

The robots.txt file -- This file controls search engine access to the directories within your site. Some webmasters block the link directory they are building with the robots file, which lets them hoard extra PageRank. How? Google counts incoming links to a site as extra PageRank; if you are building links that point to other domains, you send them small amounts of PageRank from your page. If a partner blocks the page your link sits on, you never receive any PageRank in return. A quick check to see whether the page your URL is on is being blocked is to examine the robots.txt file.

You can do this by typing...

www.e-axis-inc.com/robots.txt

www.anydomainname.com/robots.txt

... as you can see, e-axis-inc.com does not block any directories within its site. This ensures that all reciprocal links and all pages are spidered by the search engines.
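That manual check can also be scripted. A minimal sketch, assuming a hypothetical partner whose robots.txt blocks its /links/ directory (with a live site you would fetch www.anydomainname.com/robots.txt first instead of parsing a string):

```python
# Parse a link partner's robots.txt and ask whether the page carrying
# your reciprocal link is crawlable.
from urllib.robotparser import RobotFileParser

partner_rules = """\
User-agent: *
Disallow: /links/
"""

rp = RobotFileParser()
rp.parse(partner_rules.splitlines())

# False here means crawlers are told to skip the page with your link,
# so the "reciprocal" link passes you nothing.
link_page_ok = rp.can_fetch("*", "http://www.example-partner.com/links/page3.html")
home_ok = rp.can_fetch("*", "http://www.example-partner.com/index.html")
print(link_page_ok, home_ok)
```

The partner domain and paths above are invented for the example; substitute the real page that carries your link.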

Following up your link campaign

After about 4 weeks, check all of your link partners to see if they have modified your link or blocked the search engines.
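A sketch of that follow-up check, using a canned page in place of one you would actually download from a partner (all URLs here are hypothetical):

```python
# Confirm your link is still present on a partner's page by collecting
# every href and looking for your domain.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs.extend(v for k, v in attrs if k == "href")

partner_page = """
<html><body>
<a href="http://www.example-partner.com/">Home</a>
<a href="http://www.mysite.example/">My Site</a>
</body></html>
"""

collector = LinkCollector()
collector.feed(partner_page)
has_link = any("mysite.example" in h for h in collector.hrefs)
print(has_link)
```

Run against each partner page every few weeks; a False result means the link was removed or edited.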

Another great link strategy to speed things up: as you build more and more link partners, some are updated more frequently than others by the major search engines. Go to each partner page where your link appears and copy the URL into a list in a document. Next, use a free service to submit your partners' pages to the search engines; this is a rapid way to ensure your links start to show up online.

A second way to validate the quality of a link partner is to use a free site spider service to see whether you are able to view the page properly, "like a search engine would." You can find a tool that will do this for you at www.instantposition.com. It's completely free: just register your email and start submitting your partners' pages to the major search engines.


Tuesday, May 18, 2004

DMNews.com | News | Article: "While Google and its gigantic stock offering draw the attention, Ask Jeeves has been content to quietly build its share of the search market...

The company's acquisition of Interactive Search Holdings, which closed May 6, doubled its share of the search market to 7 percent...

The key to Ask Jeeves' position in the competitive search market, its executives say, is that it owns its search technology. Unlike MSN, which is building algorithmic search capability, and AOL, which relies on Google, Ask Jeeves has Teoma, a Web search technology it bought for a bargain-basement price of $4.5 million in September 2001, well before the search boom.

"Index search is the great differentiator in the opportunity for value creation in this space," Ask Jeeves CEO Steve Berkowitz said in a conference call after closing the ISH acquisition.

Unlike Google, which determines search results based on link popularity, Teoma takes a community approach, judging relevance based on authorities it ties to subjects. By combining Teoma with its own roots in natural-language search, Ask Jeeves has been a leader in directly returning information to user queries through its Smart Search initiative.

With Smart Search, a query for "capital of Latvia" yields "Riga" and one for "weather in San Francisco" returns a forecast, in lieu of an index of Web pages. Ask Jeeves executives bank on steps like these to expand its search market share at the expense of rivals"

Google

Keeping Up with Yahoo!'s Advertising Options: "Keeping Up with Yahoo!'s Advertising Options"

"I've been trying to keep up with all the search engine changes, but I'm not sure what to think of Yahoo! Now you can: 1) submit for free, 2) Pay $49.99 plus $0.15 to $0.30 per click, 3) Pay for Performance (i.e., Overture), or 4) pay $299.00 to submit to Yahoo!'s Directory. I understand Yahoo! is looking for multiple streams of income, but what is the best choice for a small business, especially if it doesn't have an e-commerce site?" -- Patricia Hughes, Hughes Technology Solutions

Your questions would be answered differently for different kinds of businesses. Here's how I see it:

Submit for free (http://submit.search.yahoo.com/free/request) is a no-brainer. By all means submit your website to Yahoo!, but first make sure that both your webpages and navigation system are search engine friendly. Your traffic from "natural" or "organic" searches is the lowest cost traffic you can find, so it's well worth the effort (and expense) to optimize your site to receive it. You can learn more about how to do this in my new e-book, Dr. Wilson's Plain-Spoken Guide to Search Engine Optimization.

Overture Site Match is Yahoo!'s new paid inclusion service. They guarantee that your pages will be included within about 48 hours in the indexes for AltaVista, AllTheWeb (FAST), and Yahoo!, but you get no bump in rankings for your dollars. Costs for 11 or more URLs are $10 each annually, plus either 15¢ or 30¢ per click, depending upon product category. I do not recommend this unless Yahoo! isn't listing your product sales pages and you have already optimized these webpages to rank high. Even then, many companies won't find this profitable. http://smallbusiness.yahoo.com/bzinfo/prod/marketserv/searchsub.php

Overture Pay for Performance (P4P or PPC, http://wilsonweb.com/afd/overture.htm) is a very important advertising channel for businesses selling products or services on the Internet. Many small businesses are using Overture P4P and Google AdWords to make a nice living. Local businesses should now consider taking advantage of geo-targeting features that only show your PPC ads to those in your immediate area.

Yahoo! Directory Express Submit (https://ecom.yahoo.com/dir/express/intro/) is a bit pricey for some businesses at $299 annually. But a paid link from this high PageRanked site will boost your own site's ranking on Google, Yahoo!, and other search engines. Do it if you can afford it. Remember to get a free listing in the Open Directory Project (www.dmoz.com), though you'll need patience waiting for a volunteer editor to consider your submission.

Yahoo! Product Submit (http://smallbusiness.yahoo.com/bzinfo/prod/marketserv/prodsubmit.php). If you have consumer products, consider a paid listing in Yahoo! Shopping. You'll pay 19¢ to 50¢ per click depending upon your product category ($1.25/click for flowers and diamonds), but you'll get targeted traffic from people in a shopping mode. Remember that Google's Froogle product submissions are free and will give you a bump in Google's regular search rankings.

Google

Slashdot | Welcome to the 'Plogging' World: "Roland Piquepaille writes 'No, it's not a typo. A plog is short for 'project log' like a blog is short for 'web log.' And plogs are starting to be used as tools to manage projects, especially in the IT world, as discovered by Michael Schrage of MIT. He reports his findings in an article published by CIO Magazine, 'The Virtues of Chitchat.' Schrage found that while plogs are not exactly commonplace, they're not exactly rare either. And they are even used to manage large IT projects, such as ERP rollouts. I totally agree with him that a plog is of great value for integrating people into a team or for keeping track of a project's progress. And you, what's your view? If you're a project manager, do you use a plog for better control?'"

Welcome to the 'Plogging' World!: "So plogs can and should be different from blogs. Different organizations have the opportunity -- I would now say the obligation -- to explore how best to marry this medium of expression with the insatiable need for better managing communication, coordination and collaboration with IT and its clients. Frankly, I think plogs -- like project leadership -- represent an investment in professional development. That is, if a developer or manager or customer support rep can produce plogs that attract interest, raise awareness and foment change -- well, that's a skill that deserves recognition and reward."

Google

Persuasive writing for your promotional marketing campaigns
This comment was so simple yet so ingenious I've already seen several other attendees refer to it in their own e-zines. A 33-year-old marketer who's already banked millions from the Internet said, "Don't worry about being a perfectionist or you're screwed." You can make plenty of money on the Internet without being perfect. He wasn't saying put out junk. Just don't be so hard on yourself.

Google

"Cheap" and "Free" Keywords: Not Always a Bargain: "Cheap' and 'Free' Keywords: Not Always a Bargain"

There's no loyalty from shoppers who buy only from the lowest-price provider. The moment a competitor lowers its price below yours, the customer leaves. There's such cost and effort associated with acquiring customers that in many service industries, customers only become profitable in the relationship's second or third year.

Spending acquisition dollars to attract customers who defect the moment a cheaper option is available is short-term thinking.

In an ongoing paid search advertising campaign, a large insurance company expressed interest in appending some of its keywords and keyword phrases with the words "cheap" and "free," as in "cheap health insurance" and "free insurance quote."

We tested these phrases and noticed some trends. Keywords and keyword phrases with solid conversion rates plummeted when "cheap" or "free" was appended onto them. Specifically, conversions were 25 percent less likely when the word "cheap" was appended to any of the company's search terms. This occurred in the insurance industry. Other verticals may experience different results.

A second phenomenon is the relationship between longer keyword phrases and higher conversion rate. We expect a specific query to convert at a higher rate than a broad keyword would. Not everyone grasps the greater benefit: More specific, longer phrases are often less expensive than broad keywords. Again, the plural of "anecdote" is "data."

The longer the keyword phrase, the higher the conversion rate. But often, longer keyword phrases aren't queried with the same frequency as shorter phrases. The trick is to identify as many of the longer phrases as possible. Google's new AdWords Automater system could be a powerful tool in increasing marketers' ability to identify more long, specific keyword phrases that can be purchased at lower rates to increase volume on these higher-converting, less-expensive keyword phrases.
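The long-tail logic above can be sketched in a few lines. The keyword report below is invented for illustration; it is not data from the article's insurance campaign:

```python
# Hypothetical keyword report rows: (phrase, clicks, conversions, avg CPC).
report = [
    ("insurance", 10000, 80, 1.90),
    ("health insurance", 4000, 60, 1.40),
    ("family health insurance quote", 600, 21, 0.55),
]

def conversion_rate(clicks, conversions):
    return conversions / clicks

# Sort by word count: longer, more specific phrases tend to convert
# better and cost less per click, though they draw fewer searches.
for phrase, clicks, conv, cpc in sorted(report, key=lambda r: len(r[0].split())):
    print(f"{phrase!r}: {conversion_rate(clicks, conv):.1%} conversion at ${cpc:.2f} CPC")
```

With these assumed numbers, the one-word head term converts at 0.8% at $1.90 per click, while the five-word phrase converts at 3.5% at $0.55 — the pattern the article describes.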

Back to that earlier point: Though appending "cheap" or "free" to a keyword phrase makes the phrase more specific, consider the long-term benefit of attracting and paying for clicks from bargain-hunting customers.

With only anecdotal evidence and gut instinct on which to base this hypothesis, I'd posit even if customers searching on "cheap"- and "free"-related queries were attracted at a significantly lower CPC and converted profitably on their first site visit, they're more likely to defect when your competitor targets that phrase and offers a lower price.

The answer in SEM, as with all marketing, is test and measure.

Google

Is Content King of Effective Online Advertising?
Which works best: targeting online advertising according to content, or tracking visitors' online activities and following their paths with relevant creative? A study conducted this month by Tacoda and iVillage points to the latter.

Google

CBS MarketWatch - Business News - Financial Information - Investment Tools

Suitor for Lycos?

It's been a little more than two weeks since reports surfaced that Terra Networks (TRLY) had retained an investment bank to help value its Lycos portal business for sale. Alan Meckler, the CEO of JupiterMedia (JUPM), thinks he knows a perfect suitor. "My guess: Google goes for the kill shot and adds a Yahoo-like service to its business line," the irrepressible Internet entrepreneur wrote in his Web log. "Getting the Lycos traffic and distribution will add a terrific dimension to Google and make it more of a media company."

Google

Semantic Web to Take Center Stage at WWW2004: "Semantic Web to Take Center Stage at WWW2004 "
WWW2004 Program Sessions: "WWW2004 Program Sessions"

Google

National Post: "About 42 per cent of U.S. Web users went to Google's search engine in March, compared with 31 per cent for Yahoo and 29 per cent for MSN, according to Nielsen/Net Ratings.

Microsoft is racing to play catch-up, a typical approach whenever it perceives a threat, said David Smith, vice-president of Internet strategy with Gartner Group.

Microsoft plans to unveil its own Internet search technology this year after seeing what MSN director Lisa Gurry termed the 'amazing' consumer demand and the moneymaking potential.

At first, Microsoft plans to use its new technology only for Internet searches based on relevance, replacing Inktomi, now owned by rival Yahoo! Inc. Microsoft will continue to work with Overture Services, another Yahoo subsidiary, for the paid listings that run alongside regular search results.
Microsoft also is gradually unveiling a news search product, called NewsBot, similar to Google's news offering, which uses software to sort news stories based on relevance. Other technologies being developed include BlogBot, to search Web journals, and AnswerBot to better answer questions posed in plain English.

Microsoft also says that search will be a key component of its next version of Windows, dubbed Longhorn, which isn't expected until at least 2006. "

Google

DoubleClick jumps into search | CNET News.com: "Online ad company DoubleClick said on Monday it has bought search-engine marketing specialist Performics for $58 million in cash, in a bid to profit from the fast-growing sector.
New York-based DoubleClick sells technology to manage, track and deliver Web advertisements and e-mail promotions. With the acquisition of Chicago-based Performics, it plans to offer advertisers added tools to manage marketing campaigns conducted with search engines, an industry expected to be worth up to $2.1 billion in 2004. In addition, it will provide software to analyze affiliate-marketing campaigns using Performics technology...

The Performics product will enable DoubleClick to offer a product that meets large advertisers' need for software to simplify the running of multiple ad campaigns on search engines. Marketers typically bid for hundreds or thousands of keywords on search engines such as Google and Yahoo to appear in query results for those terms. Keeping tabs on bid amounts and the effectiveness of any given keyword is the specialty of Performics' software.

After the deal closes, DoubleClick will offer a new product called DART Search, which will be integrated with its existing ad-management and ad-reporting tools for marketers"

Google

Monday, May 17, 2004

FTC Letter in full
Commercial Alert Letter: "UNITED STATES OF AMERICA
FEDERAL TRADE COMMISSION
WASHINGTON, D.C. 20580
June 27, 2002

Mr. Gary Ruskin
Executive Director
Commercial Alert
3719 SE Hawthorne Blvd.
Suite 281
Portland, OR 97214
Re: Complaint Requesting Investigation of Various Internet Search Engine Companies for Paid Placement and Paid Inclusion Programs

Dear Mr. Ruskin:
This letter responds to the July 16, 2001 complaint filed by Commercial Alert requesting that the Federal Trade Commission ('FTC' or 'Commission') investigate whether Alta Vista Co., AOL Time Warner, Inc., Direct Hit Technologies, iWon, Inc., Looksmart, Ltd., Microsoft Corp., and Terra Lycos S.A. (hereinafter, 'named search engine companies') are violating Section 5 of the Federal Trade Commission Act ('FTC Act'), 15 U.S.C. § 45(a)(1),(1) by failing to disclose that advertisements are inserted into search engine results lists.
I. Overview
Your complaint alleges that when search engines include Web sites in search results lists, on the basis of 'paid placement' and 'paid inclusion,' such search results are advertisements. Your complaint contends that 'without clear and conspicuous disclosure that the ads are ads,' such 'concealment may mislead search engine users to believe that search results are based on relevancy alone, not marketing ploys.' After careful review, the staff of the Bureau of Consumer Protection has determined not to recommend that the Commission take formal action against the search engine companies listed in your complaint at this time. That determination should not, however, be construed as a determination by either the Bureau of Consumer Protection or the Commission as to whether or not the practices described in your complaint violate the FTC Act or any other statute enforced by the Commission.

Although the staff is not recommending Commission action at this time, we are sending letters to search engine companies outlining the need for clear and conspicuous disclosures of paid placement, and in some instances paid inclusion, so that businesses may avoid possible future Commission action. In addition, this response to your complaint will be placed on the Commission's public record and on the FTC's Web site.(2) For the most part, the staff believes that while many search engine companies do attempt some disclosure of paid placement, their current disclosures may not be sufficiently clear. The staff also believes that, depending on the nature of the paid inclusion program, there should be clearer disclosure of the use of paid inclusion, including more conspicuous descriptions of how any such program operates and its impact on search results. As a general matter, clear and conspicuous disclosures would put consumers in a better position to determine the importance of these practices in their choice of search engines to use.

II. Paid Placement and Paid Inclusion

In conducting its review, the staff considered "paid placement" to be any program in which individual Web sites or URLs can pay for a higher ranking in a search results list, with the result that relevancy measures alone do not dictate their rank. The staff considered "paid inclusion" to be any program in which individual Web sites or URLs are included in a search engine's index, or pool, of sites available for display as search results, when that Web site or URL might not otherwise have been included, or might not have been included at a particular point in time, but for participation in the paid program.

A. Paid Placement

Paid placement programs can take many forms. Search engines may operate their own paid placement programs or obtain search results from third parties who in turn operate paid placement programs. The staff agrees that search engines should clearly and conspicuously disclose that certain Web sites or URLs have paid for higher placement in the display of search results. This information is likely to be important to consumers,(3) who otherwise might believe that the sites placed higher up in the list were independently chosen and ranked as being more relevant to the consumer's search query than those search results placed further down in the list. The failure to disclose paid placement adequately within search results deviates from the established deception principle of clearly distinguishing editorial content from advertising content.(4) The purpose of such a demarcation is to advise consumers as to when they are being solicited, as opposed to being impartially informed.

Because search engines historically displayed search results based on relevancy to the search query, as determined by algorithms or other objective criteria, the staff believes that consumers may reasonably expect that the search results displayed by individual search engines are ranked in accordance with this standard industry practice - that is, based on a set of impartial factors. Thus, a departure from the standard practice, such as a search engine's insertion of paid-for placements in the search list, may need to be disclosed clearly and conspicuously to avoid the potential for deception.

Thus, any Web sites or URLs that have paid to be ranked higher than they would be ranked by relevancy, or other objective criteria, should be clearly labeled as such using terms conveying that the ranking is paid for. In the staff's view, to avoid deception such labels need to convey that the sites listed are placed higher, or otherwise presented more prominently, because they have paid for their ranking or position, rather than solely based on some objective criteria relating to the probable relevance of their content to any particular search request.

Paid placement listings may also be denoted by segregating them from non-paid listings. Each separate set of paid placement listings should be clearly labeled as such so they can be easily distinguished from other types. Of the 12 search sites owned or operated by the 7 named search engine companies, 11 segregate paid ranking results by placing them above the non-paid results or prominently elsewhere. Many of these sites appear to be headed in the right direction, using terms such as "Sponsored Links" or "Sponsored Search Listings" to denote payment for rankings. In some cases, these sites display more than one set of paid placement listings, and these additional listings are labeled using terms such as "Recommended Sites," "Featured Listings," "Premier Listings," "Search Partners," "Provided by the [________] Network," or "Start Here." Other sites use much more ambiguous terms such as "Products and Services," "News," "Resources," "Featured Listings," "Partner Search Results," or "Spotlight," or no labels at all.(5) To avoid deception, these sites should be labeled to better convey that paid placement is being used.

The staff is encouraging search engine companies to make changes to their paid-ranking search results to clearly delineate them as such, whether they are segregated from, or inserted into, non-paid listings. Factors to be considered in making such a disclosure clear and conspicuous are prominence, placement, presentation (i.e., it uses terms and a format that are easy for consumers to understand, and that do not contradict other statements made), and proximity to a claim that it explains or qualifies.

B. Paid Inclusion

Paid inclusion can take many forms. Examples of paid inclusion include programs where the only sites listed are those that have paid; where paid sites are intermingled among non-paid sites; and where companies pay to have their Web sites or URLs reviewed more quickly, or for more frequent spidering of their Web sites or URLs, or for the review or inclusion of deeper levels of their Web sites, than is the case with non-paid sites. As with paid placement, search engines may operate their own paid inclusion programs or obtain search results from third parties who in turn operate paid inclusion programs.

To the extent that paid inclusion does not distort the ranking of a Web site or URL, many of these programs provide benefits to consumers, by incorporating more Web sites - or content - into an individual search engine's database than might otherwise be the case. This can give consumers a greater number of choices in search results lists.(6)

In other instances, the intermingling of non-paid Web sites with paid-inclusion Web sites in the search database may cause consumer confusion and mislead consumers as to the reasons for a Web site's or URL's inclusion in the search results. If the program distorts rankings, the program or its impact on rankings should be prominently disclosed. And certainly, if all Web sites included in a search guide or a search engine's database have paid to be included, so that the search engine is essentially an advertising medium, that fact should be disclosed adequately to avoid deception. Accordingly, the staff is encouraging search engines that offer paid inclusion programs to clearly describe how sites are selected for inclusion in their indices.

In short, through the use of clear and conspicuous disclosures, consumers should be able to easily locate a search engine's explanation of any paid inclusion program, and discern the impact of paid inclusion on search results lists. In this way, consumers will be in a better position to determine whether the practice of paid inclusion is important to them in their choice of the search engines they use. Currently, although certain of the named search engines do, in fact, use paid inclusion, in the staff's view none of them adequately discloses its usage or offers clear and conspicuous explanations of its impact on search results. In the staff's view, labels such as "Web Directory Sites," "Results," "Matching Sites," and "Reviewed Web Sites" may not clearly convey that certain sites or URLs in the search result list, or perhaps all of the sites or URLs in the search database, are participants in a paid inclusion program, rather than being included based on some other criteria that may not involve payment.(7)

III. Conclusion

In short, the staff is recommending that all search engine companies(8) review their Web sites and make any changes necessary to ensure that:

any paid ranking search results are distinguished from non-paid results with clear and conspicuous disclosures;
the use of paid inclusion is clearly and conspicuously explained and disclosed; and
no affirmative statement is made that might mislead consumers as to the basis on which a search result is generated.
To the extent that search engine companies provide search results to third-party Web sites, including other search engines or guides, we are encouraging the companies to discuss with the third-party Web sites whether the above criteria are being met with respect to any supplied search results that involve a payment of any kind for ranking, insertion of paid results into unpaid results, or any pay-for-inclusion program.

The staff recognizes that search engine companies' business models vary and that there is a need for flexibility in the manner in which paid placement and paid inclusion are clearly and conspicuously disclosed. We encourage all companies making disclosures online to review and implement guidance provided in the Commission's business education piece, Dot Com Disclosures: Information About Online Advertising, which discusses how to make clear and conspicuous disclosures online.(9)

We appreciate your bringing this matter to our attention. Complaints from groups such as yours are a helpful means of reviewing possible unfair or deceptive practices, and we hope you will continue to bring to our attention any practices that you believe may violate the FTC Act.

Very truly yours,

Heather Hippsley
Acting Associate Director
Division of Advertising Practices

Attachment

Endnotes:

1. Section 5 of the FTC Act prohibits unfair or deceptive acts or practices in or affecting commerce. The Commission will find deception if there is a representation, omission, or practice that is likely to mislead the consumer acting reasonably in the circumstances, to the consumer's detriment. See FTC Policy Statement on Deception, appended to Cliffdale Associates, Inc., 103 F.T.C. 110, 174 (1984).

2. Your complaint has become public by virtue of its placement on your Web site, www.commercialalert.org.

3. Currently, there are very few studies on this subject. A Consumers Union national survey found that 60% of U.S. Internet users had not heard or read that certain search engines were paid fees to list some sites more prominently than others in their search results. After being told that some search engines take these fees, 80% said it is important (including 44% who said it is very important) for a search engine to disclose, in its search results or in an easy-to-find page on its site, that it is being paid to list certain sites more prominently. If clearly told in the search results that some sites are displayed prominently because they paid, 30% said they would be less likely to use that search engine, 10% said more likely, and 4% said don't know/refused. Consumers Union also reported that "given the complicated situation, 56% say it would make no difference to them." It stated that the "combination of users' low level of knowledge of search engine practices and their strong demand that search engines should come clean leaves users splintered about how to react." See "A Matter of Trust: What Users Want From Web Sites," www.consumerwebwatch.com/news/report1.pdf (Apr. 16, 2002). A recent BBC-commissioned survey found that 71% of U.K. users were unaware that some search engines let advertisers pay to get more prominent positions in search results. See, e.g., "BBC Launches its Non-Commercial Search Engine in Response to 'Tainted' Results," VentureReporter.net (May 2, 2002).

4. The Commission has brought actions against infomercial producers for failure to disclose that a television show was not an independent program but was, instead, a paid commercial advertisement. See, e.g., National Media Corp., 116 F.T.C. 549 (1993) (consent order). Similarly, the Commission alleged as deceptive the use of misleading formats that made an advertisement appear to be an independently written article published in a magazine. See, e.g., Georgetown Publishing House Limited Partnership, 122 F.T.C. 392 (1996) (consent order).

5. We note that several search engines not named in the complaint also use labels such as "Featured Search Results" and "Premier Listings" to denote paid-for higher rankings; some, however, provide no indication at all that certain sites have paid for their higher positions.

6. Indeed, if the pay-for-inclusion mechanism does not distort the placement criteria, this fact might be a positive selling point for search engines.

7. Similarly, several of the named search engines that obtain listings from third parties using paid inclusion programs display the third-party's logo or terms such as "Provided By ___" or "Powered By ___" at the bottom of their search results lists. The staff believes that, as disclosures, these measures are not conspicuously located nor do they adequately explain the existence of paid inclusion or its impact on the search results list.

8. This would include the named search engine companies, and other companies providing similar Internet search services to consumers, as well as meta search engines that submit simultaneous search queries to (and display results from) numerous third-party search engines.

9. Dot Com Disclosures: Information About Online Advertising is available on the FTC Web site at www.ftc.gov/bcp/conline/pubs/buspubs/dotcom/index.pdf.
"

Google

Saturday, May 15, 2004

This guy is a major search engine spam reporter...well worth keeping an eye on his research & articles...Benjamin Edelman - Home: "my economics Ph.D. dissertation -- a major undertaking, of course, but a project I'm particularly excited about. I can't yet say precisely what I'll be writing about -- but I wouldn't be surprised if my articles included analysis of the domain name industry, of pay-per-click advertising, and perhaps even of online travel services"

Google

Yahoo Versus Google.... Net portal wars | CNET News.com

Week in review: Net portal wars | CNET News.com: "Yahoo's one-two combination began when the Cable News Network's online arm tapped it to replace Google to power its search results. The switchover, which went live Wednesday, means that Yahoo will provide algorithmic and paid results whenever CNN users conduct a search query on the site.
Yahoo will also begin offering 'virtually unlimited storage' for its paid e-mail customers and will upgrade free users to 100MB from its current 4MB, in a challenge to Google's Gmail service. The upgrade is part of an overall enhancement for Yahoo Mail that will launch this summer"

Spam ban:
Search engines delete adware company | CNET News.com: "Yahoo and Google have disabled links to controversial adware maker WhenU after the company was accused of engaging in unauthorized practices aimed at boosting its search rankings, WhenU's top executive confirmed Thursday.
The practices came to light following an investigation by antispyware crusader Ben Edelman, a Harvard student who found that the company used a technique known as 'cloaking' to dupe search engines into favorably listing decoy Web pages that direct people to other destinations, once they click on the link...

WhenU's efforts to manipulate search engine results are aimed to offset negative perceptions about the company, Edelman said.

His investigation found that WhenU created Web pages that borrowed from news articles published by various news Web sites about the company in a bid to drive up its rankings on Google and Yahoo. One such article, published by CNET's News.com, was used to cast a favorable light on WhenU in a court opinion about its pop-up ads. That Web page, among several similar pages, was fed to Google and Yahoo's search crawlers, which are used to index the Web.

Once those pages were indexed and listed in search results for WhenU, the listings were used to redirect people who clicked to other WhenU pages.

"WhenU turned to the Web to try to clean up its image, after facing widespread criticism from consumers, businesses and policy makers. But by resorting to misleading and prohibited methods, WhenU crossed a serious ethical line," Edelman wrote in an e-mail interview
"

Google

Saturday, May 08, 2004

Measure & track
Can you count on search engines? 6/May/2004

Search columnist Mike Grehan says the accountability of search marketing is actually far from convincing.

Features: "Last year, in Santa Barbara, I was explaining a little about search engines and what we in the search engine marketing field have to measure and count and analyse and...

why is it, in our industry, we always seem to be dealing more with possibility than we ever do with probability?

Well, as I've posed my own question - I'll try and answer it with a few of my own passing thoughts.
First, we're almost always dealing with what could be termed as "dodgy data".
And second, we're desperate to be able to give our clients any kind of measure or metric as to what they're getting for their cash.

I mean, you'd think with PPC it would be a fairly safe bet. You spend this much and you make that much. Bingo: return on investment. Or is it?

On a train journey to London, I once sat opposite the financial director of a large UK organisation. What he classed as ROI and what I classed as ROI were not the same thing at all. Believe me. As far as he was concerned marketing is actually an expense and not an asset. So it all depends on who is looking at the figures and what figures they're looking at and how those figures are presented... I think you get my drift."

Google

Friday, May 07, 2004

CBS MarketWatch.com: WASHINGTON (CBS.MW) -- More drumbeats about a deal for America Online are being heard. On the heels of a report from the U.K. this week that private equity firms were interested in buying Dulles, Va.-based AOL, a published report out Thursday suggested that Microsoft (MSFT) or Yahoo (YHOO) could be interested.

Google

1,000s & 1,000s having similar problems... Yahoo Updating: "Posted: 04/27/2004 06:27 am

Thank God. I am not the only one having problems with yahoo. There is a huge drop in traffic for me site about 40% less traffic. Since yahoo, alltheweb are same so everything is also gone on alltheweb also. "

Google

The Web's 'Unique' Problem:

45% said they clear cookies at least once a month. The average was once a week. "For work and home, this behavior was consistent," Harmon explains. Many may not be using browser tools to remove cookies, but rather "spyware" and "computer hygiene" tools that have become more popular as the public worries about unauthorized snooping on their online activities.

The implications are obvious. "If a user is clearing cookies once a week from home, the Web sites he or she visits are counting many people at home as four different unique users over the course of a month," Harmon says. "The unique user numbers are vastly inflated, but at the same time, frequency of visiting is undercounted."

Google

BW Online | May 6, 2004 | Web Search for Tomorrow: "the arena that Google dominates is now being targeted by all comers. From bigwig rivals such as Microsoft (MSFT ) and Yahoo! (YHOO ) to startups like Vivisimo and ChoiceStream, scores of companies are spending billions of dollars trying to come up with new and better ways to help people find information. The flurry of research not only poses a potential threat to Google's dominance, over the next few years it could also revolutionize how users search the Web"

Personalization: honing results to fit a searcher's location or preferences. An astronomy buff who searches for "Saturn" would get results about the planet, for example, not the car.

Trends: search engines such as Google provide a current snapshot of information and views on specific topics available on the Web. But there's no reliable way to discern how that snapshot changes over time.

Everywhere: Probing the Internet is valuable. But much of what a user wants may be tucked away elsewhere -- stashed in a Word or PowerPoint file on a hard drive, or in e-mail archived on a server somewhere else. Grabbing such data isn't easy right now, but companies ranging from Lycos to Microsoft are exploring ways to dig out information from these sources with a single search tool.

Microsoft, for example, has a two-year-old project, dubbed Stuff I've Seen, that creates a searchable index of every last word that appears on a person's PC screen -- from work files to appointments to Web pages.

Better Results: The average search query contains 2.5 words, leaving plenty of room for interpretation. As a result, searches typically turn up hundreds of links, many of them irrelevant. A handful of startups, from Vivisimo to iXmatch Inc., are using so-called clustering technology that organizes several hundred search results into subject-specific folders.
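As a rough illustration of the clustering idea (not Vivisimo's or iXmatch's actual technology, and the result titles and topic list here are invented), a few lines of Python can bucket result titles into subject-specific folders by matching each title against a small topic list:

```python
from collections import defaultdict

def cluster_results(titles, topics):
    """Group result titles into folders named after the first topic
    word found in each title; unmatched titles fall into 'Other'."""
    folders = defaultdict(list)
    for title in titles:
        words = set(title.lower().split())
        for topic in topics:
            if topic in words:
                folders[topic.capitalize()].append(title)
                break
        else:
            folders["Other"].append(title)
    return dict(folders)

results = [
    "Saturn: the ringed planet",
    "Planet Saturn facts for kids",
    "Used Saturn cars for sale",
]
folders = cluster_results(results, ["planet", "cars"])
print(folders)  # astronomy results and car results land in separate folders
```

Real clustering engines derive the folder labels from the documents themselves rather than from a fixed list, but the payoff is the same: several hundred mixed results become a handful of labelled groups.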

Google

Wednesday, May 05, 2004

Could be useful... What's on your mind re: Web Analytics? : E-consultancy.com: "'Emetrics Summit' (24-26 May, Four Seasons Hotel). Jim Sterne, he of international web metrics book-writing, speaking and consulting fame, is about to conduct his Emetrics Summit in London (24-26 May). In the run-up he's asking: what's bothering you in the area of measuring website success? What would you like to ask your web analytics peers?"

Google

Pandia cloned by copyright thief: "What do you do when someone copies all the content of your site over to another domain? That is exactly what happened to Pandia this week. Read our article about site theft and copyright infringement."

Google

An in-depth analysis of SEO and accessibility.
Search Engine Optimisation and Accessibility: "The following chapters describe how search engine algorithms, and attempts to address them through search engine optimisation as a marketing technique, create new opportunities for making web content accessible - and equally, how improving the accessibility of a site can help maximise both its search engine promotion potential and its user conversions and ROI."

Main points:

Provide a text equivalent for every non-text element (e.g., via "alt", "longdesc", or in element content). This includes: images, graphical representations of text (including symbols), image map regions, animations (e.g., animated GIFs), applets and programmatic objects, ASCII art, frames, scripts, images used as list bullets, spacers, graphical buttons, sounds (played with or without user interaction), stand-alone audio files, audio tracks of video, and video.
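The first guideline lends itself to a mechanical audit. A minimal sketch in Python, using the standard library's html.parser, that flags <img> elements with no alt attribute (the filenames and markup are made up for illustration; a full checker would also cover image maps, applets and the other element types listed above):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect the src of every <img> tag that lacks an alt attribute.
    A rough accessibility-audit sketch, not a full WCAG checker."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "?"))

checker = AltChecker()
checker.feed('<p><img src="logo.gif" alt="Company logo">'
             '<img src="spacer.gif"></p>')
print(checker.missing)  # only spacer.gif lacks a text equivalent
```

Spacer images like the one flagged here should carry an empty alt="" rather than none at all, so that screen readers and crawlers alike know they are decorative.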

Ensure that dynamic content is accessible or provide an alternative presentation or page. Until user agents can automatically read aloud the text equivalent of a visual track, provide an auditory description of the important information of the visual track of a multimedia presentation.

Until user agents render text equivalents for client-side image map links, provide redundant text links for each active region of a client-side image map.

Ensure that all information conveyed with colour is also interpretable without colour, for example from context or mark-up.

Ensure that foreground and background colour combinations provide sufficient contrast when viewed by someone having colour deficits or when viewed on a black and white screen.

Search engine optimisation techniques make good use of the separation of content and presentation. One approach that shows excellent results in search engine optimisation campaigns is the use of layouts controlled by Cascading Style Sheets (CSS). However, organize documents so they may be read without style sheets. For example, when an HTML document is rendered without associated style sheets, it must still be possible to read the document.

For data tables, identify row and column headers. For example, a table that lists UK tour businesses may have, as column header, "Tour operators". If this cell is not differentiated from others, then it is simply one phrase amongst many. However, if the cell is marked as a column header, this clearly communicates to search engines that this page is a resource about "Tour operators" and not just a page that mentions the phrase incidentally. As such, this makes it a powerful search engine promotion tool for describing structured data.

Provide summaries for tables. The CAPTION element and the "summary" attribute are intended to describe the purpose and content of a table. This is particularly useful when a table's content can only be properly understood visually rather than semantically.
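Both table guidelines come down to giving crawlers explicit structural signals. A sketch of how a parser can pull out exactly those signals, the CAPTION text and the TH header cells, using Python's standard html.parser (the table content is invented for illustration):

```python
from html.parser import HTMLParser

class HeaderExtractor(HTMLParser):
    """Extract <caption> text and <th> header cells from a table:
    roughly the structural signals the guide says crawlers can use
    to understand tabular data (an illustrative sketch only)."""
    def __init__(self):
        super().__init__()
        self.headers = []
        self.caption = ""
        self._in = None

    def handle_starttag(self, tag, attrs):
        if tag in ("th", "caption"):
            self._in = tag

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "th":
            self.headers.append(data.strip())
        elif self._in == "caption":
            self.caption += data.strip()

table = ('<table summary="UK tour businesses by region">'
         '<caption>UK tour businesses</caption>'
         '<tr><th>Tour operators</th><th>Region</th></tr>'
         '<tr><td>Acme Tours</td><td>Kent</td></tr></table>')
ex = HeaderExtractor()
ex.feed(table)
print(ex.caption, ex.headers)
```

Had "Tour operators" been an ordinary TD cell, it would be indistinguishable from the data rows; marked up as TH, it survives extraction as a label for the whole column.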

Ensure that pages are usable when scripts, applets, or other programmatic objects are turned off or not supported. If this is not possible, provide equivalent information on an alternative accessible page.

Frames are essentially deprecated and search engine promotion professionals will advise against their use for solid optimisation reasons. However, if frames must be used, then the best available SEO strategy is to enrich the frameset with descriptive content for search engines through the TITLE tag and the content of the NOFRAMES element.

Do not use link text such as "click here", "read more" or similar variants. Using anchor text (sometimes called link text) to accurately and specifically describe the page to which it links is a cornerstone of effective search engine optimisation.
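This guideline, too, can be audited mechanically. A quick Python sketch that flags links whose anchor text is generic rather than descriptive (the URLs and the list of generic phrases are invented for illustration):

```python
from html.parser import HTMLParser

GENERIC = {"click here", "read more", "more", "here"}

class AnchorAudit(HTMLParser):
    """Flag hrefs whose anchor text is a generic phrase like
    'click here' instead of describing the target page."""
    def __init__(self):
        super().__init__()
        self.flagged = []
        self._href = None
        self._text = ""

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = ""

    def handle_data(self, data):
        if self._href is not None:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "a":
            if self._text.strip().lower() in GENERIC:
                self.flagged.append(self._href)
            self._href = None

audit = AnchorAudit()
audit.feed('<a href="/tours">Jamaica tour operators</a> '
           '<a href="/offer">click here</a>')
print(audit.flagged)  # the descriptive link passes; the generic one is flagged
```

The descriptive link does double duty: it tells a screen-reader user where the link goes, and it hands the search engine the exact keywords that describe the target page.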

Provide information about the general layout of a site (e.g., a site map or table of contents).

Conclusion

It seems unlikely that all accessibility guidelines can or will be included in search engine algorithms. There is, however, a clear tendency that accessibility as a whole will be factored in more strongly in the near future, and some guidelines will almost certainly be considered for inclusion.

Since it is impossible to second-guess which they will be, the sensible, no-risk strategy is to improve accessibility across the board.

This article has been based on the Search Engine Optimisation and Accessibility guide. A free PDF version is also available.

Google

Monday, May 03, 2004


Yahoo! Dominating Top PPC Phrases: "Yahoo in Europe have actually hired a 3rd party SEM company to manage their PPC campaigns. So, they are using bid management tools. As a portal, I expect them to compete in PPC in more and more markets (unfortunately)! "

(best forum for yahoo at moment....)

Google

Mike Grehan's eMarketing News - Internet Marketing Tips: "Yahoo Search Manager Spills the Beans... Mike Grehan in conversation with... Jon Glick."

in-depth look into the new Yahoo!... also.... a virtual best practice guide to search engine marketing

History & future:
two phases in search... The first phase was all about what was on the page.

The second generation of engines started to look at what it was they could find out about that page by looking at what else there was on the web that gave more information about it. The directory listings, the connectivity, the anchor text etc. And we're still in phase two.

For me, and this is me speaking personally, the next phase will be where you're able to take into account information about the user. And of course local, because local search is a subset of personalisation. For local to really work, you need to know where the person is. So, the issue of: "I'm number one for this keyword"... may not exist at all in a few years. You know, you'll be number one for that keyword depending on who types it in! And from where and on what day... and... It is going to get more complex than something that can simply be summed up in a ranking algorithm, let alone how many checks somebody has on a toolbar.


NEW YAHOO:

This week saw the roll out of Site Match and I think it surprised a lot of people that there wasn't just a straight flick of the switch to Inktomi results with its old version of paid inclusion. That's not what happened, so do you want to bring me up to speed with the new model?

o Jon:

There are three components to the Site Match program.

The first is just, as you said, Site Match, and that's the basic per-URL submission. It's a subscription charge plus a cost per click. We do this for a number of reasons. If you take a look at what you would have had to do to get into all the individual subscription programs - Alta Vista Express Inclusion, Inktomi Site Submit etc. - you'd generate a subscription fee of over 150 dollars. But now the base fee for the first year is 49 dollars, and then it drops for subsequent URLs. So it's much more economical, especially for a small site that wants to get across a large network. Also, it means that people who are going into a category where they're going to generate a lot of very high-value traffic have a chance to do it on an ROI basis which they can measure. So it's a more tuned program that we're offering.

The second is Site Match Xchange, and that's a similar program to public Site Match, but it's for the commercial providers. It's an XML feed on a cost-per-click basis, very similar to what people were used to with Alta Vista and Trusted Feed as well as Index Connect. In addition, I have to mention that as always, about 99% of our content is free crawled in. And there is a free submission option now which covers the entire Yahoo! network... pages which are included in the Site Match Xchange program have to have unique titles and they have to have meta data.

The Yahoo! directory is there for the different ways that people decide to look for information on the web. Some people like to parse a hierarchy, some people want to find other sites that are related within a certain category. And other people take the more direct route of: "I know what I want, I know the keywords..." and they just go directly to the search...The main reason that the Yahoo! directory exists is not to create connectivity or do anything specifically for Yahoo! search. The directory exists as a separate way for people to find things at Yahoo! Here you're dealing with several million pages instead of billions...

Is there a similar kind of relationship between the Yahoo! directory and Yahoo! search?

o Jon:The way that I would classify it is, that our relationship with the Yahoo! directory is very similar to that which we have with Open Directory. We also have a relationship with Open Directory Project. The way that we look at it for Yahoo! search, with all of its comprehensiveness and quality content is that, if we can find that somewhere, whether it's with a Yahoo! property or a third party, we want to have that content, we want to have that information and we want it reflected in the Yahoo! search index...But, you know, within the main search results, everyone is treated equally.

So, it's a wise decision to check in the Yahoo! database to see if you're already in there before you start thinking about subscriptions. But there may be some businesses who get the idea that, even if they are in the index, they may do better if they subscribe. You know pay to play. Are they likely to see any further benefit in doing that?

o Jon:

If by benefit you mean ranking - no there's not. It's an inclusion program.
It is just about inclusion. It gives us an opportunity to use resources to go through and give them an editorial review of their site and puts them on a one-to-one relationship with the folks at Yahoo! And if you go to Site Match Xchange then you get some good customer service support. It's not going to do anything to influence their ranking. But let's take an example of say, a travel company. The Yahoo! Slurp crawler typically is going to come around and visit a site every three to four weeks. If you're a travel company... two weeks ago you wanted to sell Mardi Gras Getaways. But that's finished and nobody's buying those breaks now. It's Spring breaks for college students maybe. Now if your content changes that dramatically, having us come back and crawl your site every 48 hours may have a significant impact on your business. If you have a page which doesn’t change much, like consumer electronics... standard web crawl may be fine. There's a guy who came to see me earlier and he's doing an art exhibit and they won't have the pages ready until a few days before they're in each city. So waiting for the free crawl to come around may mean that they're not in when they need to be. It is an additional service and if it makes sense for people then they're welcome to take advantage of it. If they're happy with it and they're positioned well and have the crawl frequency, then use it. People who don't use the program will never be disadvantaged in the rankings as compared to other people who do.

o Mike:

Site Match Xchange is for sites with more than 1000 pages, yes? Or is that 2000... whatever... Is that when it starts to make sense to look at an XML feed, when you're in the thousands like that?

o Jon:

That makes sense, but it may actually make sense before you get to those figures. It may make sense with 500 pages if you go through a reseller. The other thing with the XML feed is it does allow people to target things more specifically. We do an editorial review of all those XML feeds, and one of the reasons is, as you say, people are giving us meta data. And we do want to make sure that the meta data they're giving us corresponds to what the user's expectations are...


meta keywords are back again! After all that time away, now they're alive and well at Yahoo! search...

o Jon:

Yes, we do use meta keywords. So let me touch on meta tags real fast. We index the meta description tag. It counts similar to body text. It's also a good fallback for us if there's no text on the page for us to lift an abstract to show to users. It won't always be used because we prefer to have the user's search terms in what we show. So if we find those in the body text we're going to show that, so that people can see a little snippet of what they're going to see when they land on that page. Other meta tags we deal with are things like noindex, nofollow and nocache - we respect those. For the meta keywords tag... well, originally it was a good idea. To me it's a great idea which unfortunately went wrong because it's so heavily spammed. It's like, the people who knew how to use it also knew how to abuse it! What we use it for right now is... I'd explain it as match and not rank. Let me give a better description of what that really means. Obviously, for a page to show up for a user's query, it has to contain all the terms that the user types, either on the page, through the meta data, or in the anchor text of a link. So, if you have a product which is frequently misspelled, or if you're located in one community but do business in several surrounding communities, having the names of those communities or those alternate spellings in your meta keywords tag means that your page is now a candidate to show up in that search. That doesn't say that it'll rank, but at least it's considered. Whereas, if those words never appear, then it can't be considered.
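Jon's "match, not rank" distinction can be captured in a few lines. This is only a sketch of the idea he describes, not Yahoo!'s implementation, and the business and town names are made up:

```python
def is_candidate(query, body, meta_keywords, anchor_text):
    """A page is only a candidate for a query if every query term
    appears somewhere: body text, meta keywords, or inbound anchor
    text. Meta keywords widen the candidate set; they say nothing
    about where the page will rank."""
    haystack = set(
        (body + " " + meta_keywords + " " + anchor_text).lower().split()
    )
    return all(term.lower() in haystack for term in query.split())

body = "Plumbing services in Springfield"
meta = "plumber plummer Shelbyville Ogdenville"  # misspellings + nearby towns

print(is_candidate("plumber Shelbyville", body, meta, ""))      # True
print(is_candidate("electrician Shelbyville", body, meta, ""))  # False
```

Without the meta keywords, the first query could never match this page; with them, the page enters the candidate set and the ranking algorithm takes it from there.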

o Mike:

So, the advice would be to use the meta keywords tag, as we used to do back in the old days, for synonyms and misspellings...

how many words in the title tag, how many characters...[laughs ]

o Jon:

We typically show roughly 60 characters. That's the maximum we'd show. I'm not a professional copywriter, so I can't tell you "is short and punchy better than lots of information..." Individual sites have different kinds of content and they have to make their individual choice. For example, at Yahoo! Sports we want a very concise title tag. For somebody searching for the New England Patriots, for instance, a title like "New England Patriots on Yahoo! Sports" is probably all we need for a page that has that information. For other people, if they're selling... a... Palm Pilot, well, they may want a title that says '50% off the new Palm Zire X 1234' plus the name of the store - a longer title may make more sense for them. Again, they have to depend on their copywriters to advise them what works best for clicks and conversions. So we'll index all of the title, but we'll only display 60 characters. You don't want to go past that because you don't want dot, dot, dot at the end of your title.
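The 60-character display cut-off is easy to sketch. This illustrates the behaviour Jon describes, not Yahoo!'s actual code, and the store name in the long title is invented:

```python
def display_title(title, limit=60):
    """The whole title is indexed, but the result listing shows at
    most `limit` characters; anything longer is cut and ellipsised."""
    if len(title) <= limit:
        return title
    return title[:limit].rstrip() + "..."

short = "New England Patriots on Yahoo! Sports"
long_ = ("50% off the new Palm Zire X 1234 at Example Store "
         "with free next-day shipping")

print(display_title(short))  # fits: shown in full
print(display_title(long_))  # over 60 chars: truncated with "..."
```

The practical advice follows directly: front-load the terms you care about, and keep the whole thing within the display limit if you want to avoid the trailing dots.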

Let's talk Spam! Of course it's a huge problem with search engines. People who are creating web pages in the industry worry so much about what they're doing with the pages and how they're linking and submitting... and will I get banned... I get asked a lot of questions like: "If I link to my other web site will they know it's mine and ban me?" Or: "My hotel is in New York, New York, will I get banned for keyword stuffing?" Crazy worries. I guess for most of the smaller businesses which aren't up to speed with search engine optimisation, they hear a lot of propaganda which worries them. But at the other end of the scale, I tend to hear more from you guys at the search engines about the activities of less ethical affiliate marketers out there. Now those guys certainly live by their own rules. How do you deal with it?

o Jon:

Well let me just say first that, in that sense Spam has gotten a lot better over the years. You don't really much have people trying to appear for off topic terms as they tended to. You now have people who are trying to be very relevant. They're trying to offer a service, but the issue with affiliate Spam is that they're trying to offer the same service as three hundred other people. And the way we look at that is... we look at that the same as we look at duplicate content. If someone searches for a book and there are affiliates in there, we're giving the user ten opportunities to see the same information, to buy the same product, from the same store, at the same price. If that happens, we haven't given our user a good service or a good experience. We've given them one result. So we are looking at how we can filter a lot of this stuff out. There are a lot of free sign up affiliate programs. They've pretty much mushroomed over the past few years. The plus side is, they're on topic. They're not showing up where they shouldn't... it's the other way... they're showing up too much where they should [laughs] We look at it like this: what does a site bring to the table? Is there some unique information here? Or is the sole purpose of that site to transact on another site, so that someone can get a commission... if that's the case, we'd rather put them directly in the store ourselves, than send them to someone else who's simply telling them how to get to the store.

o Mike:

You guys must get Spam reports the same as all the other engines. So when somebody does a search on a particular product and it turns up that there are ten affiliates in there, whether they're Spamming or not, it's likely that the affiliates could be turning up before the merchant ever does. If you get a high level of that occurring, do you ever go back to the merchant with some feedback. You know, say like, guys do want to optimise your web site or just do something about your own ranking?

o Jon:

We do actually talk to a lot of companies. We obviously have a relationship with many of them through the various Yahoo! properties. Different companies often take a different tack. For instance, eBay is a company which has been very, very good at working with us and listening to us on the affiliate issue. Their feeling is really twofold. One is, the people that are confusing the results in the search engines are the same people who are doing things that they don't like on eBay. For them, a bad actor in one space tends to be a bad actor in the other. The other thing, of course, is if you have someone who is using a cloaked page - so to a search engine it's a huge bundle of keywords and massive interlinking of domains on different IPs, while a user coming in with IE 5 gets an automatic redirect to pages on eBay - they know that the user doesn't think: "Oh, it's an affiliate Spammer." The perception for the user is simply this: eBay tricked me! There's a link that I clicked that said "get something free", I clicked it and ended up on eBay. And they wonder why eBay would do that to them. eBay know that those things hurt their brand. So that's why they have been very proactive in working with us to ensure that those kinds of affiliates are not part of their program. But... some other merchants may look at it and say: since we're paying on a CPA (cost per acquisition) basis, we're actually indifferent as to how that traffic comes to us. They may say, we don't want to monitor our affiliates, or we can't monitor our affiliates... whatever, we'll take the traffic because there's no downside. It's a different way that they may look at it. And you know, it depends what position they're in, and more, how much they care about their brand, or don't care...

o Mike:

And a similar kind of thing happens on the paid side. I don't want to get too much into that because this is the organic side and I don't want you to get too embroiled in that as I don't know if you're much connected with it. But in PPC with a campaign you can only bid once on the same keyword. It's not possible for you to fix it so that you can turn up at one, two and three on the paid search side. So, what tends to happen there is that, the merchants don't mind if the affiliates are bidding on the same keywords. So one way or another, it's likely that, if they can't hold all the positions down the right hand side, the affiliates will help them. And at least that way they get the sale anyway.

What is it that gets you banned - if at all? Is it cloaking, mini networks...

o Jon:

Mike, there isn't an exhaustive list. There are new technologies coming out all of the time. At the highest, or fundamental, level: someone who is doing something with the intent of distorting search results to users... that's pretty much the overarching view of what would be considered a violation of our content policies. In terms of specifics... um... let's do some notes on cloaking. If you're showing vastly different content to different user agents... that's basically cloaking. Two different pages - one for IE and one for Netscape with only formatting differences between them, or different presentation formats for people coming in on a mobile device perhaps, or just a different type of GUI - that's acceptable. That's helpful.


Massively interlinked domains will most definitely get you banned. Again, it's spotted as an attempt to distort the results of the search engine. The general rule is that we're looking at popularity on the web via in-links. The links are viewed as votes for other pages. And part of voting is that you can't vote for yourself. People who buy multiple domains and interlink them for the purpose of falsely increasing popularity are doing just that: voting for themselves. And the same applies to people who join reciprocal link programs. Unfortunately there are many people who join these because they're fairly new to search engine marketing and maybe someone tells them that this is a great way to do things. That's very dangerous. People linking to you for financial or mutual-gain reasons, as opposed to linking to your site because it's a great site - a site they would go to themselves and would prefer their visitors to see - are doing it the wrong way. Let's just take the travel space again. Someone who has 30 pages of links buried behind the home page, literally each with several hundred links, with everything from... golf carts, to roofing, to... who knows. You know that's kind of like: hey, if you like our travel to Jamaica site, you may also be interested in our roofing site... [Mike and Jon burst out laughing here]

o Mike:

It's a shame really. People seem so desperate for links but frequently just have no idea where they're going to get them from. It's my mantra over and over again, and I know you've heard me saying it many times at the conferences: the importance is in the quality of the links you have - not the quantity. And of course, everyone wants to do incoming links. They don't want to do reciprocal linking. They even worry too much about whether they should link out themselves. Getting links in is a lovely blessing, but should people worry too much about linking out?

o Jon:

The thing to remember here, Mike, is who you're linking out to. If you hang out in bad neighbourhoods, as we say, then you will get more scrutiny; that's inevitable. If you end up linking to a lot of people who are bad actors and maybe have their sites banned, then linking to them means you're more likely to be scrutinised to see if you're part of that chain. The other thing, of course, is that when you take a look at connectivity, every site has a certain amount of weight that it gets when it's voting on the web, and that is based on the in-links. And it gets to distribute that... energy... via its out-links. And by that, I mean outside the domain. Navigational links and other links within a domain don't help connectivity; they help crawlers find their way through the site. I'm just talking here about the true out-links, those outside of the domain. For those... how much each link counts is divided by the number that exist. So if you have a couple of partners or suppliers you're working with and have an affinity with, if you link out to them - then that helps a lot. If you have... 3, 4, 5 of them... well, if you added 300 random reciprocal links, then you've just diluted the value of the links that you gave to the other people you have the real relationship with. It's as simple as this: people who have massive link farms aren't really giving much of a vote to anyone because they're diluting their own voting capability across so many other people. So you need to consider the number of out-links you have on a page, because each additional link makes them all count for less.
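The dilution Jon describes is simple arithmetic: a page's voting weight is split evenly across its external out-links. A toy Python sketch (domain names invented; real connectivity analysis is of course far more involved than an even split):

```python
def outlink_weight(page_weight, external_links):
    """Divide a page's voting weight evenly across its external
    out-links, so each additional link makes every link count for
    less. Internal/navigational links are excluded by assumption."""
    if not external_links:
        return {}
    share = page_weight / len(external_links)
    return {url: share for url in external_links}

# Two carefully chosen partners each receive a substantial vote.
few = outlink_weight(1.0, ["partner-a.example", "partner-b.example"])

# Add 300 random reciprocal links and every vote shrinks to 1/300.
many = outlink_weight(1.0, [f"link{i}.example" for i in range(300)])

print(few["partner-a.example"])
print(many["link0.example"])
```

This is why the 30-pages-of-reciprocal-links scheme backfires: the link farm dilutes its own votes to near nothing, and the genuine partner links get dragged down with them.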

Google

Saturday, May 01, 2004

Features: "TOP 10: 'Portal' sites 26/April/2004. Top 10 UK 'Computers and Internet' sites for the week ending 24 April, 2004, based on visits.

1. (22.74 %) MSN UK http://www.msn.co.uk
2. (15.06 %) Google UK http://www.google.co.uk
3. (10.64 %) Freeserve http://www.freeserve.com
4. (6.07 %) Yahoo! UK & Ireland http://uk.yahoo.com
5. (6.04 %) MSN.co.uk Search http://search.msn.co.uk
6. (5.14 %) Yahoo! Europe Mail http://uk.mail.yahoo.com
7. (2.99 %) Ask Jeeves UK http://www.ask.co.uk
8. (2.75 %) My Yahoo! UK & Ireland http://uk.my.yahoo.com
9. (2.33 %) Yahoo! UK & Ireland Search http://uk.search.yahoo.com
10. (1.81 %) Faceparty http://www.faceparty.com "

Google
Creative Commons Licence
This work is licensed under a Creative Commons License.