How to make part of a page non-indexable for the Google crawler

We have all faced a situation where we want a page to be indexed but need to keep some parts of it un-indexable. This is often the case for content consumed from a third party.

For example, TripAdvisor provides user reviews and other content through its APIs. Any website can license this content and display it on their own pages. But as per basic SEO principles, this leads to content duplication, and may result in the website being penalized. TripAdvisor published the content first on its own pages, so it is the licensee's pages that risk the penalty.

So, how do we make sure such content is not indexed by the crawler, but remains visible to users?

Google provides the googleon and googleoff tags, which are written as HTML comments. (Strictly speaking, these tags are documented for the Google Search Appliance rather than the public Googlebot, so verify they apply to your setup.)
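A minimal sketch of the syntax (the surrounding paragraph markup is illustrative):

```html
<p>This paragraph is indexed normally.</p>

<!--googleoff: index-->
<p>Third-party content the crawler should not add to its index.</p>
<!--googleon: index-->

<p>Indexing resumes from this point onward.</p>
```

Besides index, the Search Appliance documentation also lists the anchor, snippet, and all values, where all combines the effect of the other three.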

All you need to do is put the 'googleoff' tag, place the content you don't want indexed after it, and then put the 'googleon' tag to make the crawler resume indexing.
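A fuller sketch in the spirit of the TripAdvisor example above; the hotel name, review markup, and class name are illustrative assumptions:

```html
<html>
  <body>
    <h1>Hotel Sunrise</h1>
    <p>Our own description of the hotel is indexed as usual.</p>

    <!-- Stop indexing before the licensed third-party reviews. -->
    <!--googleoff: all-->
    <div class="third-party-reviews">
      <p>"Great location, friendly staff!" &ndash; a TripAdvisor reviewer</p>
    </div>
    <!--googleon: all-->

    <p>Content after the googleon tag is indexed again.</p>
  </body>
</html>
```

Using all (rather than index) also keeps the wrapped text out of anchors and snippets, which is usually what you want for licensed content.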

More about other such lesser-known tweaks at:

Google Guidelines