Archive for the 'search engine optimization' Category

Google’s Place Pages are Designed for Optimization

Google’s new Place Pages are designed for optimization, which potentially makes them great landing pages.  Is Google positioning itself to simplify advertising for local businesses?

The downfall of most SEM offers to local merchants is that they deliver lots of clicks but few conversions.  That’s because too often nobody is optimizing the landing page (or has even defined what a conversion is).  Google has now put themselves in a position to address that by allowing the landing page to be optimized.  They could even have merchants use Google Voice if they want to optimize to receive calls.

What Does ‘Designed for Optimization’ Mean?

Recently I wrote in Picking Winners about the use of controlled experiments and A/B testing to optimize website performance.  Perhaps the most widely known application of this principle is the optimization of website landing pages using tools like Google Website Optimizer.

The basic idea in landing page optimization is to empirically test the performance of several different design options against some specified conversion goal.  For example, if your goal is to get people to ‘sign-up’ for something you’d test different page designs and see which one performed best.
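A minimal sketch of what such a test setup could look like.  The variant names and the hash-based bucketing scheme are invented for illustration; real tools like Google Website Optimizer handle this for you:

```python
import hashlib

# Hypothetical sketch: deterministically assign each visitor to a page
# variant (so returning visitors always see the same design), then tally
# conversions per variant.
VARIANTS = ["control", "big_signup_button", "short_form"]

def assign_variant(user_id: str) -> str:
    """Hash the user id into a stable bucket, one per variant."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# Running tally of (visits, conversions) per variant.
tallies = {v: {"visits": 0, "conversions": 0} for v in VARIANTS}

def record_visit(user_id: str, converted: bool) -> None:
    v = assign_variant(user_id)
    tallies[v]["visits"] += 1
    if converted:
        tallies[v]["conversions"] += 1
```

The deterministic hash matters: if the same visitor saw a different design on each visit, you couldn’t cleanly attribute a conversion to any one design.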

If you want to do this easily – and especially if you want to do it using some automated process – you need to adopt a web page design that is amenable to such an approach.  Andrew Chen has written a great post on keeping the design consistent during A/B testing.  He says that the secret is to create an open design – and gives Amazon’s home page as a classic example.

Well, it turns out that Google’s Place Pages are another excellent example of open design that allows automated optimization.  Have a look at one of the example pages Google highlighted:

Google Place Page Showing Block Structure

As shown above, the page is broken into two columns and the content is organized into various blocks.  This makes it easy for an automated process to vary both the placement and size of each of the blocks and the content shown within each block.  What’s more, you can select and optimize the look of the page based on where the traffic is coming from – varying the look and feel of the page based on how the user got there.  So, if you arrived at this page as a result of a search for ‘Tartine Bakery reviews’ the ‘review block’ might be much more prominently displayed.
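As a toy illustration of that last idea, here is what a block-ordering rule keyed to the referral query might look like.  The block names and the rule itself are invented, not anything Google has published:

```python
# Hypothetical sketch: reorder a Place Page's content blocks based on
# the search query that brought the visitor to the page.
DEFAULT_ORDER = ["map", "details", "photos", "reviews", "related"]

def order_blocks(referral_query: str) -> list:
    """Promote the reviews block when the query mentions reviews."""
    order = list(DEFAULT_ORDER)
    if "review" in referral_query.lower():
        order.remove("reviews")
        order.insert(0, "reviews")
    return order
```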

The fact the pages are well suited for optimization doesn’t necessarily mean all that much.  Google is well known for being an A/B testing fanatic.  So, this may just reflect a desire to be able to more easily optimize how information is presented to users.

But it could also be a first step towards something more…

Could Google Try to Close The Optimization Gap?

Optimizing landing pages is a fairly well understood process.  Unfortunately, it’s a process that few smaller businesses have the time and expertise to perform.  So, it doesn’t get done.  And the end result is that small businesses don’t see the expected results from clicks and become discouraged.

But now Google has designed a landing page that a machine can optimize.

Imagine a tool that allows a local business to set up an AdWords campaign that automatically creates and tests landing pages.  The tool might suggest appropriate keyword alternatives along with appropriate landing pages, then start running the alternatives and select the combinations that deliver the best ROI.  All with minimal involvement from the business owner.  Google certainly has the scale and machine-learning expertise to accomplish something like this.
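One plausible mechanism for ‘select the combinations that deliver the best ROI’ is a simple bandit algorithm.  This is purely a sketch of the idea, not anything Google has described – an epsilon-greedy strategy keeps exploring keyword + landing-page combinations while mostly exploiting the combination with the best observed ROI:

```python
import random

# Hypothetical sketch: epsilon-greedy selection over keyword/landing-page
# combinations. Each combo can be any hashable key.
class EpsilonGreedy:
    def __init__(self, combos, epsilon=0.1):
        self.epsilon = epsilon
        self.stats = {c: {"trials": 0, "reward": 0.0} for c in combos}

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))   # explore
        # Exploit: highest average reward so far; untried combos first.
        return max(self.stats, key=lambda c: (
            self.stats[c]["reward"] / self.stats[c]["trials"]
            if self.stats[c]["trials"] else float("inf")))

    def update(self, combo, roi):
        """Record the observed ROI of one trial of this combo."""
        self.stats[combo]["trials"] += 1
        self.stats[combo]["reward"] += roi
```

Over time the tool would concentrate spend on the winning combinations automatically, which is exactly the ‘minimal involvement from the business owner’ property described above.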

What’s Missing

For one, Google would need local merchants to define some sort of ‘conversion event’.  This is conceptually as easy as defining a new ‘block type’ that will appear on the landing page and be optimized.  For example, a restaurant might view a phone call or an OpenTable registration as a conversion event.  If it’s a phone call, I imagine the merchant could be encouraged to use Google Voice to provide a closed-loop analysis of the conversion event.
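A conversion-event block could be as simple as a typed record.  The block types and fields below are entirely hypothetical:

```python
# Hypothetical sketch: conversion-event 'block types' a merchant might
# attach to their Place Page. All names and fields are invented.
CONVERSION_BLOCKS = {
    "phone_call": {"label": "Call us", "tracked_via": "google_voice"},
    "reservation": {"label": "Book a table", "tracked_via": "opentable"},
}

def make_conversion_block(kind: str) -> dict:
    """Build the block definition for one conversion event type."""
    if kind not in CONVERSION_BLOCKS:
        raise ValueError("unknown conversion block: " + kind)
    return {"type": kind, **CONVERSION_BLOCKS[kind]}
```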

Perhaps more likely than having individual merchants doing this (at least in all cases) would be a small army of SEO and SEM experts doing it on the businesses’ behalf – but within a closed-loop system managed by Google.  Google could potentially create a whole new ecosystem.

Updated (September 28, 2009): Lots of concern around a core issue of whether these pages are being indexed.  In fact, Google representatives have weighed in via the comments on posts by Erick Schonfeld at TechCrunch, Greg Sterling and Mike Blumenthal.  Google has confirmed that these new pages won’t be indexed directly, but they may be indexed if they are referred to by other sites.

They probably didn’t want to muddy the issue, but I couldn’t help but notice that they did NOT comment on my thesis about using these pages as landing pages!


Why I Like Adobe’s Purchase of Omniture

Ok, I know the folks at Adobe (yeah the Photoshop people) and Omniture (web analytics and optimization geeks) have been waiting for me to pronounce on their deal.  After mulling it over a bit, I’ve decided that it is good.

Judging from the Twitter chatter, some found it perplexing.  And apparently the market didn’t much like it either.

I like it because it recognizes that the creation of web content should be done hand in hand with activities like Search Engine Optimization (SEO) and Search Engine Marketing (SEM) and the tools used to analyze and measure the effectiveness of a website.  In today’s world, there are those who create websites and those who do SEO and SEM – and they are often blissfully unaware of each other.  That’s unfortunate given that the whole point of a website is to engage people and accomplish some commercial goal.

For a long time, the technical hurdles associated with the mechanics of creating a website have dominated the equation.  But that continues to get easier – as it should.  So rather than focus on the mechanics, you can naturally expect people to start thinking more about how to use a website as a truly effective tool.  Which should lead you to think about how you are going to structure and evaluate the content.

So, I can see Adobe creating entire new classes of tools where the very way you think about and create web pages becomes much more oriented towards optimizing and measuring the content.  For example, you might have a tool where the first thing you do for a new page is define the ‘objective’ for it (i.e. this page is intended to get people to register for the site).  From this objective you would then have tools that would advise you on ‘best practices’ for achieving it.  You would design the page from the get-go to evaluate several alternatives.  The code needed to manage this would just disappear into the woodwork.  The tools required to manage the revision of the various content elements would be part of the toolset.
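A toy sketch of what an objective-first page definition might look like.  All of the names, fields, and canned tips here are invented to illustrate the idea:

```python
# Hypothetical sketch of an 'objective driven' page definition: the
# objective comes first, and the variants to test follow from it.
page = {
    "objective": "site_registration",           # the conversion goal
    "success_metric": "registrations / visits",
    "variants": [
        {"headline": "Join free today", "cta": "Sign up"},
        {"headline": "Create your account", "cta": "Get started"},
    ],
}

def best_practices(objective: str) -> list:
    """Toy advisor: canned tips keyed by objective (invented examples)."""
    tips = {
        "site_registration": [
            "Keep the form above the fold",
            "Ask for as few fields as possible",
        ],
    }
    return tips.get(objective, [])
```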

And there is potentially a very nice network effect.  The more people use your tools to create pages and analyze them, the more data you can potentially collect on what works and what doesn’t.  This means you can do a better job of proactively advising people on what they should and shouldn’t do.  This allows the creative people to spend more time exploring new things that might work rather than wasting their time on things that are pretty unlikely to work.

Think of it as ‘objective driven design’.  I’m gonna let Adobe use that phrase if they like.

Of course, all that’s easy when you say it fast.  And difficult to execute in practice.  And I’ve glossed over at least one really important point.  What one means by an ‘effective’ website is a moving target.  Changes like social media and real-time media – not to mention changes in what people expect or want – mean that the very definition of ‘well designed’ is always shifting.

But that just makes it an interesting problem worth tackling.  Time will tell.

Picking Winners

Web applications allow us to quickly try out new features, presentations and approaches.  But people are terrible at predicting which changes are beneficial and which ones are neutral or even harmful.  That’s one reason why a systematic approach to the analysis and optimization of changes through controlled experimentation is important.

At its simplest, controlled experimentation is just trying different approaches to a problem (which can be as simple as the color used on a web page) and measuring how users respond to these changes.
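Measuring the response means deciding whether one variant genuinely beat another or the difference is just noise.  A standard way to do that for conversion rates is a two-proportion z-test, sketched here with the usual pooled-variance formula:

```python
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z statistic for conversion counts conv_a of n_a
    visitors (variant A) vs. conv_b of n_b visitors (variant B)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)              # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# |z| > 1.96 corresponds to roughly 95% confidence that the
# difference between the two variants is real.
```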

The paper “Online Experimentation at Microsoft” (PDF) was presented at KDD 2009 and provides a great overview with many concrete examples of actual experiments run at Microsoft.  Here’s one example:

The MSN Real Estate site wanted to test different designs for their “Find a home” widget. Visitors to this widget were sent to Microsoft partner sites from which MSN Real Estate earns a referral fee. Six different designs, including the incumbent, were tested.

A “contest” was run by Zaaz, the company that built the creative designs, prior to running the experiment, with each person guessing which variant would win.  Only three out of 21 people guessed the winner…

The winner, Treatment 5, increased revenues from referrals by almost 10% (due to increased clickthrough).

In general, the paper documents that even experienced experts pick the winner less than 1/3 of the time.  Meaning, the other 2/3 of the time they are recommending changes that are at best neutral or at worst actually harmful.

This is a non-technical paper that provides motivation for taking an experimental approach.  They also describe the many cultural barriers they encountered at Microsoft.  Overall, a very good read.  Highly recommended.

Of course, they recommend a very sophisticated approach.  But the same principles apply in a broad range of situations.  One common activity that falls into this category is the optimization of landing pages for SEM and SEO efforts.  In these situations you are usually assisted by tools that make it easy to get the statistical analysis right.

The take-home message is that successful companies are learning how to fail fast and move forward rather than getting stuck in endless rounds of paralysis and internal arguments.  Real-world experimentation can be the final arbiter.

via Greg Linden

Local SEO Investment a No-Brainer

Local SEO investment is a no-brainer – and it’s one that will ultimately benefit everyone trying to provide marketing services to local businesses.

David Mihm has a great post in which he describes the basics (among many other things worth your time):

Do your keyword research, figure out which phrases you want to target, claim your Google and Yahoo local listings with proper categories, submit to infoUSA, Localeze, and Acxiom (via Universal Business Listing).  That covers 90% of your bases.   This basic process for one location shouldn’t take more than a few hours.  Obviously ongoing optimization, particularly in competitive niches, requires expertise and more diligence, but think of the ‘claiming your listing’ and submission process as an analog for researching keywords and writing your ads.

Every local business should be doing this.  Today.  There is probably nothing else they can do that will provide the same level of return on investment.  As a point of reference David points to two recent studies by Conductor and Enquisite showing that SEO has a much higher return than many Pay Per Click campaigns for larger advertisers.  And given that the state of local SEO is much less evolved today, the returns in local SEO are likely even higher.

So, why isn’t every local business doing this already?  Because they don’t know any better.  And since there isn’t a lot of money to be made in telling them how to do this, people aren’t exactly banging down the door to bring them this information.  And those who are banging down their doors to sell them marketing and advertising services aren’t that interested in telling them about something that is nominally competitive.

This sort of thinking is seriously counter-productive.  First of all, all of a local merchant’s advertising efforts – both online and offline – are going to work better if they are taking care of the basics.  And this applies to those attempting to sell them additional services as well.

But more importantly, by shrouding the whole process in mystery we are undermining the confidence of local advertisers.  And when they eventually discover ‘the truth’ they are going to be very disappointed with those who didn’t help them earlier.

Google’s Carrot and Stick Approach to Local Businesses

Google has been getting a lot of attention this week due to an Apple-esque PR event.  People have dutifully reported and analyzed all the new and whizzy things Google is doing.

There wasn’t much coverage on what Google is doing in local.  There are no sexy, headline-grabbing features.  Yet, local is very much at the centre of Google’s strategy.

Google’s Local Business Challenge

Google has two challenges: first, they need all local businesses to get a proper website.  And not just any website – a good one – where good means ‘follows sensible Search Engine Optimization principles’ (which are — in any case — mostly about good writing).

Only then can they get to the second challenge: getting local businesses to advertise with Google.  Those advertising services don’t work at all for businesses without a website and they don’t work well for poorly designed sites.

The Big Carrot

Users.  And lots of them.

Google has focused relentlessly on supporting local information.  Google Maps continues its steady march towards becoming the dominant mapping application.  Universal search makes Google the default starting place for local.  And, perhaps most importantly, Google is having great success in mobile search – which is where local search increasingly happens.

So, they have plenty of the users that local businesses want and need.  It’s a REALLY BIG CARROT.

The New Carrots

One of the headline features from Google is something called ‘rich snippets’ which Tim O’Reilly describes in his excellent post:

Earlier this week, Google made a nod to the other side of the debate, introducing a feature that they call “Rich Snippets.” Basically, if you mark up pages with certain microformats (and soon, with RDFa), Google will take this data into account, and will provide enhanced snippets in the search results. Supported microformats in the first release include those for people and for reviews.

So, for example, consider the snippet for the Yelp review page on the Slanted Door restaurant in San Francisco:


The snippet is enhanced to show the number of reviews and the average star rating, with a snippet actually taken from one of the reviews. By contrast, the Citysearch results for the same restaurant are much less compelling:


(Yelp is one of Google’s partners in the rollout of Rich Snippets; Google hopes that others will follow their lead in using enhanced markup, enabling this feature.)

I think it’s telling that both Tim and Google chose LOCAL examples to illustrate the point.  This rich, structured data has the potential to become a ‘definitive reference’ for local businesses — finally allowing them to ensure the correct contact and location information is propagated.  But of course, in order to take advantage – they need a website.
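For reference, a review page marked up with the hReview microformat (the review format Rich Snippets supports) looks roughly like this.  The restaurant name echoes the example above; the rating, review text, and reviewer are invented:

```html
<div class="hreview">
  <span class="item vcard">
    <span class="fn org">The Slanted Door</span>
  </span>
  Rating: <span class="rating">4.5</span> out of 5.
  <span class="description">Hypothetical review text goes here.</span>
  – <span class="reviewer vcard"><span class="fn">A. Reviewer</span></span>
</div>
```

The point is that the structured classes (`hreview`, `item`, `rating`, `reviewer`) let a crawler extract the rating and review count reliably, which is what powers the enhanced snippet.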

And, in a largely ignored announcement Google also added a New Local Business Ads Interaction Report:

Back in January, we announced that we were adding four new links to local business ads – “Get Directions,” “Street View” (where available), “Save to My Maps,” and “Send.” We added these interactive links to help Maps users find the information they seek about your business more quickly and easily, and to enable you to give those potential customers additional information about your business. Now, there’s a new report in your Report Center to track users’ interactions with these links.

Better local ads and now better reporting for them.

The Stick

If you don’t have a proper website and you don’t register your information with Google, you are just not going to be visible on Google.  This is a BIG STICK.  Can a business really afford to be invisible on Google?  No.  And in order to be visible, they will need to implement a website and invest in SEO (including support for the newly announced formats).

Pragmatically, this explains why Google’s map results seem so arbitrary.  The results are being ranked on the quality of the search engine optimization of the local business websites.  This provides a very strong incentive to local businesses to adhere to what Google needs them to do.  Google’s ranking algorithm is designed to attract the attention of businesses — while not punishing users too harshly during the transition period.

And of course, once those local businesses are online and properly optimized, Google will FINALLY be able to sell them advertising.

It’s not going to happen overnight — but Google has the will and the resources to see it through.

Random Ranking: Why Are Google Maps Results so Arbitrary?

I started with this search for “restaurants, Calgary” which returns something like this:

Google Maps Search for Restaurant in Calgary

Is there any rhyme or reason to the choices in the tiny subset that are highlighted for my viewing pleasure?  Oh, I realize many people have spent a lot of time reverse engineering the algorithms and that understanding these algorithms is important for Search Engine Optimization (SEO) if you want to find your business in that anointed list.  But, I mean from a USER’s point of view – it just seems arbitrary doesn’t it?

(The only one that actually makes any sense is the Earl’s ‘e’ – I know they are there because they paid to be there.)

Then Sebastian Provencher suggested changing the query to “category:restaurant, loc:calgary”.  Nominally, the same thing right?  Uh, no:

Another search on Google Maps for restaurant in Calgary

A different, equally arbitrary, set of results.

Now, in fact, there really isn’t enough information in a broad query like ‘restaurant, calgary’ to give me anything very meaningful.  The answer almost has to be arbitrary.  So, here are my beefs:

  1. The results are presented AS IF they have some sort of authoritative or relevant ranking.  Why not provide the user with an indication or explanation of how the results have been ranked?
  2. If they are essentially arbitrary, why not make them truly random or semi-random?  Change them up.  This would drive the SEO guys crazy but seems like it would be fairer.  Why not just give me the ability to shuffle the results?

Now, the folks at Google are pretty smart.  They certainly know a thing or two about ranking things.  So, what’s up?

(Aside: the results today seem different from the results yesterday – at least somewhat.  So, I’m thinking they might actually be randomizing the ranking somewhat.  Does anybody know?)
