Entries tagged with “Search”.



Looks like my prediction on the Google PageRank update was right. I had predicted it for late December to mid-January. ;)

I could see a Google PageRank (toolbar) update on a few of my blogs just minutes ago. The last update was in September, so three months on, I think it’s happening again. (No change in DailySEOblog’s PR yet.)

No official news from Google yet, but the toolbar page rank update is happening right now. Check your blogs.

You can use these tools to check your PageRank.

[Screenshot: google-pagerank-update]

UPDATE:
Matt Cutts has just confirmed on Twitter that there’s been a Toolbar Page Rank update today.

[Screenshot: matt-cutts-pgerank-update]


Source: Daily SEO blog – Google PageRank update, December 31st 2008



I’ve heard from several clients today whose page ranks changed, both for the better and for the worse (my own hardware website went from a PR5 to a PR4). However, it seems that a recurring bug where Google drops the PageRank of internal pages that deserve to be ranked has snuck into this update. After most PageRank updates I usually check one of my previous employers’ websites to see how…



Matt Cutts did a great interview with Mike MacDonald from WebProNews. They discussed Google in 2009, answered a lot of questions that I know I’ve heard from a number of clients, and confirmed a few things I believed to be true (mainly Matt’s comments on sub-domains near the end).

The video covers personalization (will it kill search results and SEO?), the future of SEO and how we need to expand into other areas including usability and conversion optimization, Flash and video ranking, sub-domains and how they can be used, and black hat SEO. Matt answers the often-asked question: do sub-domains work better than files within a site, and when?

I won’t repeat everything from the video – that would be redundant. It’s a 10 minute video and worth every minute of your time. Enjoy …


[Embedded video: WebProNews interview with Matt Cutts]


Or how to make Google pay more attention

By Jason Lee Miller – Tue, 01/22/2008 – 12:15pm.

For a while now, webmasters have fretted over why all of the pages of their websites are not indexed. As usual, there doesn’t seem to be any definite answer. But some things are definite, if not automatic, and some things seem like pretty darn good guesses.

So, we scoured the forums, blogs, and Google’s own guidelines for increasing the number of pages Google indexes, and came up with our (and the community’s) best guesses. The running consensus is that a webmaster shouldn’t expect to get all of their pages crawled and indexed, but there are ways to increase the number.

PageRank

It depends a lot on PageRank: the higher your PageRank, the more of your pages will be indexed. PageRank isn’t a blanket number for all your pages; each page has its own. A high PageRank gives the Googlebot more of a reason to return, and Matt Cutts confirms that a higher PageRank means a deeper crawl.
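
To make the “each page has its own PageRank” point concrete, here is a minimal sketch of the classic PageRank power iteration. The toy link graph, the 0.85 damping factor, and the iteration count are illustrative assumptions, not anything from the article or from Google’s production system.

```python
# Minimal PageRank power-iteration sketch (illustrative only; the damping
# factor, the toy link graph, and the iteration count are assumptions,
# not Google's actual parameters).

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outgoing in links.items():
            if not outgoing:          # dangling page: spread its rank evenly
                for other in pages:
                    new_rank[other] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank


if __name__ == "__main__":
    # Toy site: the homepage links to two inner pages, which link back.
    graph = {
        "home": ["about", "products"],
        "about": ["home"],
        "products": ["home", "about"],
    }
    for page, score in sorted(pagerank(graph).items(), key=lambda x: -x[1]):
        print(f"{page:10s} {score:.3f}")
```

The only point of the sketch is that a page’s score depends on the scores of the pages linking to it, which is why deep pages with few inbound links accumulate less PageRank and tend to be crawled less eagerly.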

Links

Give the Googlebot something to follow. Links (especially deep links) from a high PageRank site are golden as the trust is already established.

Internal links can help, too. Link to important pages from your homepage. On content pages link to relevant content on other pages.
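
As a concrete illustration of “give the Googlebot something to follow”, here is a minimal sketch of how a crawler discovers internal links on a page, using only the Python standard library. The sample markup and the example.com URLs are placeholders.

```python
# Sketch: extract the links a crawler could follow from one page.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

if __name__ == "__main__":
    homepage = '<a href="/products/">Products</a> <a href="/blog/post-1">Post</a>'
    parser = LinkExtractor("http://www.example.com/")
    parser.feed(homepage)
    internal = [u for u in parser.links
                if urlparse(u).netloc == "www.example.com"]
    print(internal)   # the deep pages reachable from the homepage
```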

Sitemap

A lot of buzz around this one. Some report that a clear, well-structured Sitemap helped get all of their pages indexed. Google’s Webmaster guidelines recommend submitting a Sitemap file, too:

· Tell us all about your pages by submitting a Sitemap file; help us learn which pages are most important to you and how often those pages change.

That page has other advice for improving crawlability, like fixing violations and validating robots.txt.

Some recommend having a Sitemap for every category or section of a site.
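
For readers who have never built one, a Sitemap is just an XML file following the sitemaps.org protocol. Below is a minimal sketch that writes one with the Python standard library; the URLs, dates, and change frequencies are made-up placeholders, not recommendations.

```python
# Minimal sitemaps.org-style Sitemap generator (a sketch; the URLs, lastmod
# dates, and changefreq values are placeholders).
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries, path="sitemap.xml"):
    """entries is a list of (loc, lastmod, changefreq) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap([
        ("http://www.example.com/", "2008-12-31", "daily"),
        ("http://www.example.com/archives/", "2008-12-01", "weekly"),
    ])
```

Large sites following the “Sitemap per category” suggestion above would typically generate several such files and tie them together with a sitemap index file.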

Speed

A recent O’Reilly report indicated that page load time and the ease with which the Googlebot can crawl a page may affect how many pages are indexed. The logic is that the faster the Googlebot can crawl, the greater the number of pages that can be indexed.

This could involve simplifying the structures and/or navigation of the site. The spiders have difficulty with Flash and Ajax. A text version should be added in those instances.
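
The crawl-budget logic above is simple arithmetic: if the Googlebot spends a roughly fixed amount of time on your site per visit, the number of pages it can fetch is that budget divided by your average response time. A tiny sketch with invented numbers (the 10-minute budget and the response times are assumptions for illustration only):

```python
# Back-of-the-envelope crawl-budget arithmetic. The time budget and the
# response times below are invented for illustration.
def pages_per_visit(crawl_budget_seconds, avg_response_seconds):
    """Roughly how many pages fit into one crawl visit."""
    return int(crawl_budget_seconds // avg_response_seconds)

if __name__ == "__main__":
    print(pages_per_visit(600, 2.0))   # slow pages:   ~300 fetches per visit
    print(pages_per_visit(600, 0.5))   # faster pages: ~1200 fetches per visit
```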

Google’s crawl caching proxy

Matt Cutts provides diagrams of how Google’s crawl caching proxy works on his blog. This was part of the Big Daddy update to make the engine faster. Any one of three indexes may crawl a site and send the fetched pages to a remote caching server, which the remaining indexes (like the blog index or the AdSense index) then access instead of their bots physically visiting your site. They all use the mirror instead.
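
As a rough mental model of that description (not Google’s actual implementation), the proxy behaves like a shared cache in front of the site: the first consumer to request a URL triggers the real fetch, and every later consumer reads the stored copy. A minimal sketch, with hypothetical consumer names:

```python
# Sketch of a shared crawl cache: whichever consumer asks for a URL first
# triggers the real fetch; later consumers get the cached copy.
# (fetch() is a stand-in for a real HTTP request; the "indexes" are
# hypothetical names used only to mirror the description above.)

class CrawlCache:
    def __init__(self, fetch):
        self._fetch = fetch          # the function that actually hits the site
        self._store = {}             # url -> cached response body

    def get(self, url, consumer):
        if url not in self._store:
            print(f"{consumer}: cache miss, fetching {url}")
            self._store[url] = self._fetch(url)
        else:
            print(f"{consumer}: cache hit for {url}")
        return self._store[url]


if __name__ == "__main__":
    fake_fetch = lambda url: f"<html>contents of {url}</html>"
    cache = CrawlCache(fake_fetch)
    cache.get("http://example.com/post-1", "web index")      # one real fetch
    cache.get("http://example.com/post-1", "blog index")     # served from cache
    cache.get("http://example.com/post-1", "adsense index")  # served from cache
```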

Verify

Verify the site with Google using the Webmaster tools.

Content, content, content

Make sure content is original; if it is a verbatim copy of another page, the Googlebot may skip it. Update frequently to keep the content fresh. Pages with an older timestamp might be viewed as static, outdated, or already indexed.

Staggered launch

Launching a huge number of pages at once could send off spam signals. In one forum, it is suggested that a webmaster launch a maximum of 5,000 pages per week.

Size matters

If you want tens of millions of pages indexed, your site will probably have to be on an Amazon.com or Microsoft.com level.

Know how your site is found, and tell Google

Find the top queries that lead to your site, and remember that anchor text helps in links. Use Google’s tools to see which of your pages are indexed and whether there are violations of some kind. Specify your preferred domain so Google knows what to index.



CBC News writes that Google, one of the staunchest supporters of net neutrality, is under fire after a report in the Wall Street Journal suggested the search company was backing out of supporting an open and equal internet.

Citing internal Google documents, the newspaper on Sunday said the search engine company was quietly negotiating with large phone and cable companies to speed up the services it delivers. The arrangement, internally called OpenEdge, would see Google place its servers directly within the networks of internet service providers, which would let end users load YouTube videos and other content from the company faster.

The deal would establish a two-tiered system that would give Google’s traffic preferential treatment over other internet content, the newspaper said.

One major cable company involved in the talks said it had been reluctant to strike a deal with Google because a two-tiered system would violate U.S. net neutrality principles that discourage internet service providers from interfering with traffic.

“If we did this, Washington would be on fire,” an unnamed company executive told the newspaper.

In a Monday blog post, Google’s Washington telecom and media counsel Richard Whitt admitted the company was negotiating with internet service providers to speed up its content, but disputed the “hyperbolic tone and confused claims” of the Wall Street Journal story.

“I want to be perfectly clear about one thing: Google remains strongly committed to the principle of net neutrality, and we will continue to work with policymakers in the years ahead to keep the internet free and open,” he said.

Whitt said Google’s deal would be non-exclusive, which means that any other internet content provider — Yahoo or Microsoft, for example — could make the same arrangement.

“Google has offered to ‘co-locate’ caching servers within broadband providers’ own facilities; this reduces the provider’s bandwidth costs since the same video wouldn’t have to be transmitted multiple times,” Whitt said. “We’ve always said that broadband providers can engage in activities like co-location and caching, so long as they do so on a non-discriminatory basis.”

The newspaper also said some net neutrality supporters, including president-elect Barack Obama, have been changing or softening their views. Obama’s net neutrality plans are “much less specific than they were before,” the Journal quoted Whitt as saying.

Whitt, however, said he was misquoted.

The rest of the CBC News article is here.
