Tag Archives: Google

How To Allow Google to Crawl your AJAX Content

28 Oct

Today we’re excited to propose a new standard for making AJAX-based websites crawlable. This will benefit webmasters and users by making content from rich and interactive AJAX-based websites universally accessible through search results on any search engine that chooses to take part. We believe that making this content available for crawling and indexing could significantly improve the web.

While AJAX-based websites are popular with users, search engines traditionally are not able to access any of the content on them. The last time we checked, almost 70% of the websites we know about use JavaScript in some form or another. Of course, most of that JavaScript is not AJAX, but the better that search engines could crawl and index AJAX, the more that developers could add richer features to their websites and still show up in search engines.

Some of the goals that we wanted to achieve with this proposal were:

  • Minimal changes are required as the website grows
  • Users and search engines see the same content (no cloaking)
  • Search engines can send users directly to the AJAX URL (not to a static copy)
  • Site owners have a way of verifying that their AJAX website is rendered correctly and thus that the crawler has access to all the content

Here’s how search engines would crawl and index AJAX in our initial proposal:

  • Slightly modify the URL fragments for stateful AJAX pages
    Stateful AJAX pages display the same content whenever accessed directly. These are pages that could be referred to in search results. Instead of a URL like http://example.com/page?query#state we would like to propose adding a token to make it possible to recognize these URLs: http://example.com/page?query#[FRAGMENTTOKEN]state. Based on a review of current URLs on the web, we propose using “!” (an exclamation point) as the token for this. The proposed URL that could be shown in search results would then be: http://example.com/page?query#!state. (A sketch of how a crawler could rewrite such URLs appears after this list.)
  • Use a headless browser that outputs an HTML snapshot on your web server
    The headless browser is used to access the AJAX page and generates HTML code based on the final state in the browser. Only specially tagged URLs are passed to the headless browser for processing. By doing this on the server side, the website owner is in control of the HTML code that is generated and can easily verify that all JavaScript is executed correctly. An example of such a browser is HtmlUnit, an open-source “GUI-less browser for Java programs”. (A sketch of serving snapshots this way also follows the list.)
  • Allow search engine crawlers to access these URLs by escaping the state
    As URL fragments are never sent with requests to servers, it’s necessary to slightly modify the URL used to access the page. At the same time, this tells the server to use the headless browser to generate HTML code instead of returning a page with JavaScript. Other, existing URLs – such as those used by the user – would be processed normally, bypassing the headless browser. We propose escaping the state information and adding it to the query parameters with a token. Using the previous example, one such URL would be http://example.com/page?query&[QUERYTOKEN]=state. Based on our analysis of current URLs on the web, we propose using “_escaped_fragment_” as the token. The proposed URL would then become http://example.com/page?query&_escaped_fragment_=state.
  • Show the original URL to users in the search results
    To improve the user experience, it makes sense to refer users directly to the AJAX-based pages. This can be achieved by showing the original URL (such as http://example.com/page?query#!state from our example above) in the search results. Search engines can check that the indexable text returned to Googlebot is the same or a subset of the text that is returned to users.
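
To make the URL rewriting concrete, here is a minimal sketch (in Java, since the proposal’s example headless browser is a Java tool) of how a crawler could turn a “pretty” #! URL into the escaped form it would actually fetch. The class and method names are purely illustrative and not part of the proposal.

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class AjaxCrawlUrls {

    // Rewrites a stateful AJAX URL such as
    //   http://example.com/page?query#!state
    // into the escaped form the proposal describes:
    //   http://example.com/page?query&_escaped_fragment_=state
    public static String toEscapedFragmentUrl(String ajaxUrl)
            throws UnsupportedEncodingException {
        int bang = ajaxUrl.indexOf("#!");
        if (bang < 0) {
            return ajaxUrl; // not a stateful AJAX URL, fetch it unchanged
        }
        String base = ajaxUrl.substring(0, bang);
        String state = ajaxUrl.substring(bang + 2);
        String separator = base.contains("?") ? "&" : "?";
        return base + separator + "_escaped_fragment_=" + URLEncoder.encode(state, "UTF-8");
    }

    public static void main(String[] args) throws UnsupportedEncodingException {
        // Prints: http://example.com/page?query&_escaped_fragment_=state
        System.out.println(toEscapedFragmentUrl("http://example.com/page?query#!state"));
    }
}
```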
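
On the server side, a hedged sketch of the snapshot step might look like the servlet below, assuming a Java servlet container and a recent HtmlUnit release in which WebClient is AutoCloseable; the servlet name, the forwarded page path and the rebuilt example URL are assumptions for illustration only.

```java
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class EscapedFragmentServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String state = req.getParameter("_escaped_fragment_");

        if (state == null) {
            // Ordinary user request: serve the regular AJAX page as usual.
            req.getRequestDispatcher("/page.html").forward(req, resp);
            return;
        }

        // Crawler request: rebuild the original "pretty" URL and let a headless
        // browser execute the JavaScript to produce the final HTML state.
        String ajaxUrl = "http://example.com/page?query#!" + state;

        try (WebClient browser = new WebClient()) {
            HtmlPage page = browser.getPage(ajaxUrl);
            browser.waitForBackgroundJavaScript(5000); // give pending AJAX calls time to finish

            resp.setContentType("text/html;charset=UTF-8");
            resp.getWriter().print(page.asXml()); // the HTML snapshot the crawler indexes
        }
    }
}
```

In this setup ordinary visitors never touch the headless browser; only requests carrying the _escaped_fragment_ parameter trigger snapshot generation, so the user-facing AJAX experience stays unchanged.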

Source: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html

Google to Offer Free SEO Review, Analysis, and Advice for Your Website

26 Oct

It is not uncommon for Google staff to provide advice about how best to organise a website to assist you with your Google rankings. After all, they want the best websites up there. Until now, though, this only really happened at live clinics and conferences, or for those who get a little over-excited at the latest Matt Cutts video. Now, though, Google has issued a call via its Webmaster Central blog for people to submit their site and get a free once-over with its SEO advice scrubbing brush.

“Eh…what?” …said every SEO professional on the planet…

Google Site Clinic

Don’t worry, there are rules. You can only submit your site if it is strictly a non-profit or government site, and Google will be posting all the advice, along with screenshots of the websites, so that everyone can benefit from the analysis and advice. The offer is also aimed at helping ‘beginners’, so it will probably not uncover anything professionals don’t already know. Google has in fact already held a few of these clinics in Spain, India and Norway, and has been providing similar SEO advice through a number of channels for years, so this really shouldn’t be anything to worry about.

If you have the right category of website, you can submit your site for the clinic by filling out a form. To be eligible, you must:

  • Be a registered non-profit organisation
  • Verify that you own the site via Google Webmaster Tools
  • Make sure your site meets the webmaster quality guidelines

If you fail to meet any of these points and you just have to get some inside SEO advice, you might want to try kidnapping Matt Cutts. Although he is a ninja, so watch out…

Source: http://www.moovinonup.com/blog/google-offer-free-seo-review-analysis-advice-website

Google Predicts More Social, and Profitable, Display Ads

12 Oct

Google might be late to the display advertising game, but it wants Madison Avenue to know that it will be ahead of the game in the future. At the Interactive Advertising Bureau’s Mixx conference in New York on Tuesday, Google made seven predictions for display advertising that the company thinks will happen by 2015.

Neal Mohan, the vice president for product management responsible for Google’s display advertising products, and Barry Salzman, managing director of media and platforms for the Americas at Google, who runs display ad sales, envisioned a Web where the ads are more social, mobile and real-time — and a lot more profitable.

1. Google announced two new kinds of video ads for YouTube and predicted that half of display ads would include cost-per-view videos that viewers choose to watch. On YouTube, people will be able to skip video ads they don’t like after five seconds (and the advertiser won’t pay for those views) or choose which of three ads to watch.

2. Half of the audience will be viewing ads in real-time, Google predicted. That means changing elements of ads on the fly based on things like location, the viewer’s interests and the weather. Google demonstrated technology from Teracent, an advertising company it acquired, that changes a car ad depending on whether the viewer is in a sunny or rainy place, is a woman or a man, and prefers shopping or sports. The technology would allow “millions of possible permutations,” Mr. Salzman said.

3. Google has been talking for a while about mobile being a priority and predicted that cellphone screens would be the No. 1 screen for viewing the Web by 2015. In display advertising, that means using phones to bridge the gap between a magazine ad and an online ad. An app called Google Goggles already lets people take photos of things like a landmark to search for them on Google. Eventually, people will be able to take a cellphone photo of a print automobile ad, for instance, and see the car in 3-D, zoom in and visit the company’s Web site.

Find more here: http://mediadecoder.blogs.nytimes.com/2010/09/28/google-predicts-more-social-and-profitable-display-ads/

Google’s New Service: goo.gl URL shortener

8 Oct

Google gave its URL shortening service goo.gl a standalone site on Thursday, allowing users to input and shorten links.

The service allows users to take any link and transform it into a shorter goo.gl URL. Like the majority of Google’s services, the shortener will give users additional functionality if they sign in with a Google account, such as a history of their shortened URLs along with analytics to allow them to track traffic.

The shortening service was originally announced in 2009 and plugged into Google’s browser toolbar and its FeedBurner RSS service. Subsequently, the service was rolled out to other Google products including News, Blogger, Maps, Picasa Web Albums and Moderator.

The service will compete with other URL shortening services, such as Bit.ly and Twitter’s just-announced t.co.

Applications for the shortened links include transmission across microblogging services such as Twitter, which has a 140-character posting limit and encourages brevity. Also, by virtue of owning the system which transforms the links into shortened ones, Google will gain information about which links the consumer wants to make shorter.

Source: zdnet.co.uk