
Monday, 23 December 2013

Google's New URL Removal Tool Is Much More Polished

Google has launched an improved public URL removal tool that makes it easier to request updates based on changes to other people's websites.
This tool is useful for removals on other people's websites. You could use it if a page has been removed completely, or if it was just changed and you need to have the snippet and cached page removed. If you're the webmaster of the site, then using the Webmaster Tools URL removal feature is faster and easier.
Google's URL/Content Removal Tool Wizard
You can remove a complete page if it is no longer live or is blocked to spiders, and you can remove content from a page if that content still shows in the Google cache but is no longer live on the page.
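
If you want to verify ahead of time that a page actually meets one of those conditions, here is a minimal sketch of the kind of check you might run yourself (this is not Google's tool; the URL is a placeholder and the third-party requests library is assumed):

```python
# Rough check: is the page gone (404/410), or marked noindex via the
# X-Robots-Tag header or a <meta name="robots"> tag?
import requests
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots_content = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_content.append((attrs.get("content") or "").lower())

def removal_ready(url):
    resp = requests.get(url, timeout=10)
    if resp.status_code in (404, 410):
        return True  # page is actually gone
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True  # blocked from indexing via HTTP header
    parser = RobotsMetaParser()
    parser.feed(resp.text)
    return any("noindex" in c for c in parser.robots_content)

print(removal_ready("http://example.com/old-page"))  # placeholder URL
```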

Tuesday, 17 December 2013

How does Google manage duplicate content?


Matt Cutts, Google's head of search spam, has published a video about duplicate content answering the question: can having duplicate content on your website affect rankings in Google's search results? Matt's response: duplicate content happens.





"How does Google handle duplicate content and what negative effects can it have on rankings from an SEO perspective?"

Matt said that somewhere between 25% and 30% of the content on the web is duplicate, and you don't have to worry about it. Google doesn't treat duplicate content as spam. Matt tells us Google looks for duplicate content, tries to group it all together, and treats it as one piece of content. For more details, see this video.
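
As a very rough illustration of that grouping idea (this is not Google's actual algorithm, just a sketch under the assumption that "duplicate" means identical text after normalization), the snippet below clusters URLs by a content fingerprint and keeps one representative per cluster:

```python
import hashlib
from collections import defaultdict

def fingerprint(text):
    """Normalize whitespace and case, then hash the result."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

def group_duplicates(pages):
    """pages: dict of url -> page text. Returns one representative URL per cluster."""
    clusters = defaultdict(list)
    for url, text in pages.items():
        clusters[fingerprint(text)].append(url)
    return {urls[0]: urls for urls in clusters.values()}

pages = {
    "http://example.com/a": "Same   article text.",
    "http://example.com/a?print=1": "same article text.",
    "http://example.com/b": "A different article.",
}
print(group_duplicates(pages))
```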






Tuesday, 10 December 2013

Matt Cutts: How Can a Site Recover From a Period of Spamming Links?

In his latest video, Google's Matt Cutts answered the question "How can a site recover from a period of spamming links?"
The example given: how did Interflora turn their ban around in 11 days? Can you explain what kind of penalty they had and how they fixed it, as some of us have spent months trying to clean things up after an unclear GWT notification? How can we help report false positives?

Monday, 9 December 2013

Matt Cutts Tweets That Google Busts Anglo Rank Link Selling Service

Google has targeted yet another link selling network, Anglo Rank, as announced in a cheeky tweet from Matt Cutts:
[Screenshot of Matt Cutts' tweet]
Anglo Rank was a paid link service with links reputed to come from multiple private networks, intended to give it the "no footprints" status it had been marketing. Cutts' tweet directly quoted the language from the selling features listed on its sales page.

Thursday, 5 December 2013

Google Webmaster Tools Added Smartphone Crawl Errors


Smartphone Crawl Errors in Google Webmaster Tools

Google announced last night a new feature within Google Webmaster Tools specifically for tracking crawl errors on smartphones.
There is now a new filter within the crawl errors section of Google Webmaster Tools for Smartphone errors.
Pierre Far, who is the face of Google for smartphone webmaster topics, shared on Google+ that this report covers the "common mistakes we see in smartphone-optimized sites." Not only do these errors hurt your "site's user experience"; Pierre said they can also "affect your site's ranking."
Here is a screen shot of the new filter in crawl errors within Webmaster Tools for Smartphone errors:


Monday, 2 December 2013

Is SEO Dead? How Should We Describe It?

Is SEO Really Dead?

Buying banner ads on search? Measuring keyword success in SEO via a ranking report? Google looking more like a portal (a la Lycos back in the day)?

No, it isn't the year 2000 all over again. This is today's search environment.
The More SEO Changes, the More Things Stay the Same

SEO, at its core, has always been and will continue to be about gaining a natural presence in the search engines. Certainly, over the course of the past 10 years, what that natural presence might be has changed quite a bit (search results are no longer links to "just" 10 web pages).

The changes we've experienced over the past few years have certainly gained the attention of our industry. "Content marketing" became a buzzword. Social media marketing has become a routine part of SEO recommendations. Content promotion is key to gaining natural links.

All of these changes were brought "to a head" for me when I read an email from Melanie White, Editor of ClickZ & Search Engine Watch, about the news that SES Conference & Expo is rebranding to become ClickZ Live:

Wednesday, 27 November 2013

Matt Cutts: No Limit on Number of Links on a Page

Matt Cutts said there is no limit on the number of links on a page...


Matt Cutts, answering the question "How many links on a page should we have? Is there a limit?", said that the 100-link limit was actually removed back in 2008. Technically, it was never a limit on the number of links; it was a guideline tied to a page size of 100KB, on the assumption that if one link takes roughly 1KB, then 100 links on a page is fine. That reasoning belonged to older algorithms, and limits on page size are no longer relevant for the modern web.

Google's Matt Cutts posted a video explaining that there is no limit on the number of links a single page can have; there can be hundreds of links per page, and as long as they are not spammy in nature, Google will not penalize you for them.

Matt explained that webmasters can have as many links as they want on any web page and that there are no restrictions. As web pages came to contain rich media, images, content, and other elements, the page size Googlebot can crawl also increased, and so did the acceptable number of links.
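
If you are curious how one of your own pages measures up against that old rule of thumb, here is a rough sketch that reports a page's size in KB and its link count (the URL is a placeholder and the third-party requests library is assumed; this is only an illustration, not anything Google runs):

```python
import requests
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href="..."> links in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.links += 1

url = "http://example.com/"  # placeholder URL
html = requests.get(url, timeout=10).text
counter = LinkCounter()
counter.feed(html)
print(f"{len(html.encode('utf-8')) / 1024:.1f} KB, {counter.links} links")
```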

Sunday, 24 November 2013

What Would You Like to See From Webmaster Tools in 2014? Tell Google's Matt Cutts


Google's Matt Cutts has given people a great opportunity on his personal blog to suggest what changes they would like to see in Google Webmaster Tools for 2014.



A few years ago, Matt Cutts asked what people would like from Google's free webmaster tools. This time around he shared a pretty big list of ideas he thinks people would like to see, but noted that the Webmaster Tools team isn't actually working on these features.

This is an awesome opportunity for people to suggest things they want in Webmaster Tools and get heard by Cutts and the Webmaster Tools development team. It is interesting to look at the comments from those who have already posted their suggestions. Everyone has a personal list of things they would love to see, and I see others wanting the same things as well.

Wednesday, 20 November 2013

Matt Cutts: Should We Use the Disavow Tool Without Receiving a Warning?

Matt says the main purpose of the disavow tool is to clean up links acquired as a result of doing bad SEO.
Matt Cutts, head of Google's web spam team, posted a video about disavowing links in which a user writes in to ask:

Should webmasters use the disavow tool even if it is believed that no penalty has been applied? For example, if we believe 'Negative SEO' has been attempted, or spammy sites we have contacted have not removed links.
Matt says there’s no cause to worry about disavowing links even without receiving a message about them in your Google Webmaster console.
In which cases can you use this tool even without a manual action?
1. Once you have exhausted all other options for getting those links removed from the web, that is the perfect time to use the disavow tool.
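
For reference, the file the disavow tool accepts is plain text with one URL or domain: entry per line and # lines as comments. Below is a minimal sketch that writes such a file (the URLs and domains are made-up placeholders):

```python
# Build a simple disavow.txt: comments start with "#", whole domains use
# the "domain:" prefix, individual pages are listed as full URLs.
bad_urls = ["http://spammy-directory.example/page1.html"]      # placeholder
bad_domains = ["link-network.example"]                          # placeholder

lines = ["# Links we could not get removed after outreach"]
lines += bad_urls
lines += [f"domain:{d}" for d in bad_domains]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```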

Monday, 18 November 2013

Google’s Matt Cutts: Don’t Duplicate Your Meta Descriptions

Matt Cutts, Google’s head of search spam, answers a question about meta descriptions in his latest video where a user writes in to ask:

Is it necessary for each single page within my website to have a unique metatag description?
When it comes to metatag descriptions, Matt says there are really only two viable options. You can either have unique metatag descriptions, or you can choose not to put any metatag description at all. Definitely don’t have duplicate metatag descriptions.
A very easy way to avoid having duplicate metatag descriptions is by registering and verifying your website with the free Google Webmaster Tools console. Google will crawl your website and tell you if they find duplicate metatag descriptions.
Generally speaking, Matt says it’s probably not worth your time to write a unique metatag description for every single page on your website. Matt doesn’t even bother to do that on his own blog. 
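
Webmaster Tools surfaces duplicates for you, but if you want a quick do-it-yourself check, here is a rough sketch that flags pages sharing the same meta description (the URL list is a placeholder and the third-party requests library is assumed):

```python
import requests
from collections import defaultdict
from html.parser import HTMLParser

class DescriptionParser(HTMLParser):
    """Grabs the content of <meta name="description"> if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

# Placeholder URLs: in practice you would pull these from your sitemap.
urls = ["http://example.com/", "http://example.com/about", "http://example.com/contact"]

seen = defaultdict(list)
for url in urls:
    parser = DescriptionParser()
    parser.feed(requests.get(url, timeout=10).text)
    seen[parser.description].append(url)

for desc, pages in seen.items():
    if desc and len(pages) > 1:
        print("Duplicate description on:", pages)
```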

Sunday, 17 November 2013

Conversion Metrics Now Available In Google AdWords Bid Simulator

The AdWords Bid Simulator tool now offers conversion estimates in addition to impressions and clicks to show how bid changes may affect conversion volume and values.

For each bid option shown in the simulator, the bid simulator provides the number of conversions — both one-per-click and many-per-click — and conversion values if assigned or set. Conversion estimates show how many clicks might result in a conversion within one day, and the estimates are based on "a recent 7 day period". (Notice Google does not say the most recent 7 days.) Because the bid simulator is based on a seven-day window, conversions that occur outside that period but still within the 30-day conversion window aren't reflected in the estimates.

Friday, 15 November 2013

Matt Cutts: Larger Websites Don't Automatically Rank Higher on Google


Webmasters have always gone by the rule that the more pages you get indexed on a website, the better it is for Google. Not only do you have a larger website overall, but you can also capture a lot of long-tail search traffic from visitors who end up on one of those many internal pages.
But does Google really rank a site higher or better if it has more pages indexed? This is the topic of the latest Webmaster Help video: "Does a website get a better overall ranking if it has a large amount of indexed pages?"
Google's Distinguished Engineer Matt Cutts said that a website with a large number of pages won't automatically rank better than others. So adding more pages to your site won't help your home page automatically rank better than smaller sites.
However, Cutts said that a site with more pages will naturally get more traffic because each of those individual pages can also rank for their own set of search queries, increasing the overall opportunity for sites to gain visitors.
Cutts stressed how links also contribute to rankings, something that a larger site often has more of naturally.
"Now typically if the site does have more pages, it might have more links pointing to it, which means it has higher PageRank," Cutts said. "If that is the case we might be willing to crawl a little bit deeper into the website and if it has higher PageRank, then we might think it's a little bit of a better match for users queries.

Thursday, 14 November 2013

Webmasters New SEO Advice Video



Google has put out a new video of SEO advice from Developer Programs Tech Lead, Maile Ohye. She discusses how to build an organic search strategy for your company.
“What’s a good way to integrate your company’s various online components, such as the website, blog, or YouTube channel? Perhaps we can help!” she says in a blog post about the video. “In under fifteen minutes, I outline a strategic approach to SEO for a mock company, Webmaster Central, where I pretend to be the SEO managing the Webmaster Central Blog.”
Specifically, she discusses: understanding searcher persona workflow, determining company and site goals, auditing your site to best reach your audience, execution, and making improvements.

Wednesday, 13 November 2013

Google's Matt Cutts Says: When Commenting on Blog Posts, Try to Use Your Real Name




In a recent video, Google's head of search spam Matt Cutts talks about whether blog comments with links are spam.


In short, most of the time, commenting and leaving links to your site or resources is not directly spam, but like anything, it can be abused.

Matt offers some tips on how to make sure your comments are not considered spam by Google or the site you are leaving it on:

(1) Use your real name when commenting. When you use a company name or anchor text you want to rank for, it makes it look like you are leaving the comment for commercial marketing purposes and thus may look spammy.

(2) If your primary link building strategy is about leaving links in blog post comments and it shows that a majority of your links come from blog comments, then that might raise a red flag.
Here is Matt Cutts' video:

Monday, 11 November 2013

Image Mismatch: The Latest Google Webmaster Tools Manual Action Penalty

Another manual action bombshell from Google

Image mismatch is when the images on your website do not match what is shown in the Google search results. Google words it as “your site’s images may be displaying differently on Google’s search results pages than they are when viewed on your site.” It is when you are serving Google one image and the user another image, also known as a form of cloaking – but Google doesn’t call it cloaking in their document.
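
If you want to spot-check your own site for this kind of mismatch, one rough approach is to request the same image once with a normal browser User-Agent and once with Googlebot's, and compare the bytes returned (a sketch only; the image URL is a placeholder and the third-party requests library is assumed):

```python
import hashlib
import requests

url = "http://example.com/images/product.jpg"  # placeholder image URL
agents = {
    "browser": "Mozilla/5.0",
    "googlebot": "Googlebot/2.1 (+http://www.google.com/bot.html)",
}

digests = {}
for name, ua in agents.items():
    body = requests.get(url, headers={"User-Agent": ua}, timeout=10).content
    digests[name] = hashlib.sha1(body).hexdigest()

if digests["browser"] != digests["googlebot"]:
    print("Warning: different image served to Googlebot (possible image mismatch).")
```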

This morning, I covered the first manual action publicly received for this image mismatch notification at the Search Engine Roundtable. I posted this screen shot of the notification of the action:

Infographic: How To Troubleshoot Google Authorship Issues, A Step-By-Step Flowchart

In October, I spoke at SMX East about some of the opportunities and challenges when implementing Google Authorship. At about the same time, a good friend of mine reached out to me with her authorship issue. While she appeared to have authorship markup set up correctly on her blog and linked correctly from Google+, her author image wasn’t appearing in SERPs — but did show for others writing on her blog. She’s not the first person to reach out to me with an issue like this.

Authorship setup can be confusing at best, and even when you think you have everything set up correctly, you still may not see your author image. What gives? It turns out that the author image itself can have an effect on whether your authorship snippet is displayed. In the case of my friend, her photo was a close-up, but it did not show her full face.

Tuesday, 5 November 2013

Google Webmaster Tools Adds "App Indexing"

Google has announced a new section for app owners and developers so that apps can be indexed just like websites. Today, any time users need to change context from a web page to an app, or vice versa, they are likely to encounter redirects, pop-up dialogs, and extra swipes and taps.

App indexing is a new capability of Google Search that uses the expertise of webmasters to help create a seamless user experience across websites and mobile apps.

Just like it crawls and indexes websites, Googlebot can now index content in your Android app. Webmasters will be able to indicate which app content they'd like Google to index in the same way they do for webpages today — through the existing Sitemap file and through Webmaster Tools. If both the webpage and the app content are successfully indexed, Google will then try to show deep links to your app straight in its search results when it thinks they're relevant for the user's query and the user has the app installed. When users tap on these deep links, your app will launch and take them directly to the content they need. Here's an example of a search for home listings in Mountain View:
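
For illustration only, here is a sketch of the kind of sitemap entry that associates a web page with an Android app deep link; the package name, paths, and exact markup are placeholders, so check Google's app indexing documentation for the authoritative format:

```python
# Build a tiny sitemap that pairs a web URL with an android-app:// deep link
# via an xhtml:link alternate. All values below are made-up placeholders.
entry = """  <url>
    <loc>http://example.com/home-listings</loc>
    <xhtml:link rel="alternate"
        href="android-app://com.example.android/http/example.com/home-listings" />
  </url>"""

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
    '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
    + entry + "\n</urlset>"
)
print(sitemap)
```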

Tuesday, 29 October 2013

Does a site rank better if it has a lot of indexed pages?


Google's head of search spam Matt Cutts discusses it further in a new Webmaster Help video in response to the user-submitted question:

Does a website get a better overall ranking if it has a large amount of indexed pages?


Tuesday, 22 October 2013

Penguin Is Completely Unrelated to the Hummingbird Algorithm


Hummingbird replaced Google's old search algorithm. It is designed to handle more complex queries in a conversational manner (think voice requests from a mobile device). Google launched Hummingbird claiming that search can interact with users in a more human way and provide more direct answers. Penguin, by contrast, exists to penalize websites that use tactics Google has stated are against the Webmaster Guidelines.

Hummingbird algorithm:

Google claims that its Hummingbird algorithm offers a more natural way to use its search engine. It improves search efficiency and can handle more complex queries in a conversational manner (think voice requests from a mobile device).

Hummingbird effect:

Hummingbird's impact was not very noticeable, but a lot of websites lost keyword rankings after the Penguin 2.1 update. Now is the time for everyone to accept that websites should be optimized for all devices, page load times should be kept low, and sites should offer a great user experience. We need to work on improving the domain authority of our websites rather than just building links, and we also need to focus on generating traffic from sources beyond Google search.

Penguin algorithm:

Penguin is aimed at decreasing the search rankings of websites that violate Google's Webmaster Guidelines by using what are now declared black-hat SEO techniques, i.e. artificially increasing a page's ranking by manipulating the number of links pointing to it.
