
Posted on by David Harry


Given that this is a site with an awesome tool for viewing a website (as a search engine would), it seemed only fitting that we should take things a step further. Many in the search optimization world believe they 'get it', but one can't be too sure.

A lot of the time we hear about things like 'links', 'meta tags' and 'title' elements. But that's really a limited view of what search engines (like the almighty Googly) are doing when they assess your site. I thought it would be an interesting exercise to take a somewhat deeper walk through the woods.

Walk with me....


Pages v Websites

One of the first things we need to understand is that there are indeed elements that differ between a page and the entire domain. This one is interesting in that I often get the sense that SEO folks don't fully appreciate it. In fact, most of what a search engine does is actually at the page level, not the domain level.

Outside of links, the only really important areas that tend to be site-wide are trust elements, classifications (topical etc), internal link ratios and geo-localized ones. By and large, a search engine actually sees your site on a page-by-page basis. This is the first important distinction to keep in mind.

Site-Level Signals

But what does Google actually see at the site level?

Authority/Trust; this is the all-encompassing concept of not only what you do on your site (outbound links, web spam, sneaky redirects, thin content etc) but what is going on off-site (link spam, social spam etc). What level of trust does your website have in the eyes of the search engine? This is something that is incredibly hard to build, but easy to lose.

Thin Content (better known as Panda); while related to the above, it's worth having on its own. Large amounts of thin content and/or duplication could result in dampening of sections or entire sites. We can also consider the GooPLA (Google Page Layout Algorithm) concepts here as well.

Classifications; while these generally exist more on the page level, there are categorical elements for an entire website, and they can contain more granular elements as well: from an ecommerce site (or subdomain etc) to a given market. This is where strong architecture can become your best friend. Assist the search engine in understanding (and classifying) what your site and its various parts are about.

Internal link ratios; in simplest terms, you want to show a search engine the importance of pages through internal links: link to the most important pages the most, the least important the least. Getting this wrong can often lead to page mapping issues (the wrong target page ranking).

Localization; another element that is more site-wide in nature is localization. Meaning: if applicable, where does this entity reside? What areas does it service? We can even consider geographic targets for sites not directly geo-location related. Elements here can include the top level domain, language etc.

Entities; an entity is a person, place or thing. One needs to look no further than Google's knowledge graph to see the importance they have seemingly been placing on these over the years. Whether it is brands on the page (ecommerce) or citations in an informational piece, make them prominent. Also, Google seems bullish on authorship of late, so consider the company's entities (people) and how they can be leveraged. Your site can 'be' an entity as well as having sub-entities and associations throughout.

Domain history; Matt Cutts recently talked about how a domain that's REALLY had issues might even carry penalties after someone else buys it. Given that, we do know to some extent that Google can look at the domain history in classifying a website. This of course plays back into the above 'trust' elements.


Page Level Signals

As I mentioned earlier, Google (and most search engines) often look at things on a page-by-page basis, not site-wide. This is an important distinction that SEO types often seem to forget. They crawl PAGES.

What elements might a search engine look at on the page level?

Meta-data; the easiest one here is of course the header data from TITLE to Meta-descriptions (not a ranking factor) and even canonical and other tags. Some of these elements can be ranking factors while others (rel=canonical for example) can tell Google how to treat the page.
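To make that concrete, here's a rough Python sketch (using the requests and BeautifulSoup libraries against a placeholder URL) of pulling the head data a crawler parses first. Treat it as an illustration, not a replica of what Google does.

```python
# A minimal sketch: pull the basic head metadata a crawler parses first.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder URL

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else None

def meta(name):
    tag = soup.find("meta", attrs={"name": name})
    return tag.get("content") if tag else None

canonical = soup.find("link", rel="canonical")

print("TITLE:      ", title)
print("Description:", meta("description"))  # not a ranking factor, but shapes the snippet
print("Robots:     ", meta("robots"))        # crawl/index directives
print("Canonical:  ", canonical.get("href") if canonical else None)
```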

Classifications (and Localization); much like the site-level elements, pages themselves will fall under classifications. It could be by content type (informational, transactional etc), by intent (commercial, knowledge), by localization (about a given region) etc. Ensure you communicate the intent and core elements of a given page.

Entities; again, as with the site level, entities can also be associated with a given page on the site. In fact, Google potentially now looks at associating a query inference (medical symptoms) with an entity (the condition itself).

Authority/trust (external links); beyond the above-mentioned authorship elements, trust signals can also be seen in what you link to (as well as citations). It can be a positive or a negative. Take the opportunity where you can to associate the page with other authoritative web identities.

Temporal signals; on the page level, Google is potentially looking at elements such as document inception/age, freshness (QDF et al), niche trends, content update rates and more. They might even look at historic query and click data.

Semantic signals; web pages have words, right? Search engines love words, yea? Then be sure that some form of semantic analysis is going on for the page, including categorization of content, related term/phrase ratios, citations and more.

Linguistic indicators (language and nuances); of course, as part of the above classification methods, the language on the page can also help with closer demographic identification.

Prominence factors; another area that doesn't get a whole lot of attention but is known to show up over the years in various patents covers elements such as headings (h1-h5), bold and lists (and possibly italics). They likely don't weigh heavily, but are indeed worth consideration.

Oh, and I do not jest when I say there's more. We're sticking to some of the higher-level parts to get the point across. We've likely already lost half the readers who started this piece. Thanks for sticking around.
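If you want to eyeball a few of those page-level prominence signals yourself, a quick sketch along these lines (requests and BeautifulSoup again, placeholder URL) will tally headings, bolded text and list items on a page. It's an illustration, not a ranking tool.

```python
# Rough sketch: tally the prominence elements mentioned above for a single page.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder URL
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

counts = {
    "headings (h1-h5)": len(soup.find_all(["h1", "h2", "h3", "h4", "h5"])),
    "bold (b/strong)":  len(soup.find_all(["b", "strong"])),
    "list items (li)":  len(soup.find_all("li")),
    "italics (i/em)":   len(soup.find_all(["i", "em"])),
}

for element, count in counts.items():
    print(f"{element}: {count}")
```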

The off-site stuff

While the focus of this offering was more about the on-site bits, Google also sees the off-site activities as part of its perception. From authority and topicality to demographics and categorizations, it does come into play as far as how Google perceives you as an entity/set of entities.

These can include....

Link related factors;

  • PageRank (or relative nodal link valuation)
  • Link text (internal and external)
  • Link relevance (global and page)
  • Also; Temporal, Personalized PageRank and Semantic analysis

Temporal;

  • Link velocity
  • Link age
  • Entity citation frequency
  • Social visibility

Authority/Trust;

  • Citations
  • Co-citations
  • TrustRank type signals

Reach;

  • Links
  • News
  • Social
  • Video

You get the idea. I don't really want to focus on the off-site elements as much today. We're looking at these because they do fall into 'how Google sees your website'. It's just not the focus...

Moving along...


The Spam Connection

Another loosely related element of how Google perceives the site is, of course, adversarial information retrieval. Better known as search and destroy for web spam. While understanding ranking factors is a great idea, it is also good to know the things that might get one dampened as well.

Web spam generally breaks into two categories;

  • Boosting; tactics used to increase a ranking (link spam for example)
  • Hiding; tactics used to mislead or trick the search engine (cloaking for example)

Again, this isn't the core focus, so be sure to read my post on web spam for more. Some common spam fighting elements include;

Content Spam

  • Language
  • Domain
  • Words per page
  • Keywords in page TITLE
  • Amount of anchor text
  • Fraction of visible content
  • Compressibility
  • Globally popular words
  • Query spam
  • Host-level spam
  • Phrase-based

Link Spam

  • TrustRank
  • Link stuffing
  • Nepotistic links
  • Topological spamming (link farms)
  • Temporal anomalies

We're all now fairly familiar with Panda (type) and Penguin devaluations as well as manual actions such as the Unnatural Links messages. But one should also be cognisant of the myriad of other ways Google might be looking at your site, in terms of how spammy it is.


SEO is dead

Right? Ok, maybe not.

Your mission, my friends, should you choose to accept it, is to develop an SEO strategy that covers off all the elements within this post. Because that, my friend, is what the savvy search optimizer should be doing.

I could sit here and explain to you ways to leverage them all, but this is an article not a book.

My goal here today was to bring light to the complexity of our reality. Whether you're a website owner, SEO enthusiast or hard-core optimizing guru, never become myopic about how Google really works. See the forest, see the trees and even the leaves.

As you were.....

Images by David Harry

Posted on by David Harry | Posted in How to, Opinion


Posted on by Sante J. Achille


There's a saying you've certainly heard before, especially if you work in an industrial environment, that goes:

"If it's not broke, don't fix it!"

This piece of popular, very down-to-earth common sense is especially relevant to the ultra-modern, technologically evolved web story I'm about to tell.

Here’s the story.

A few days ago I was approached with a “new job” by my web design partner. He sent me an email: “I was contacted by this agency for a quote to get their website indexed”.

I was travelling home from Bologna, where I had been at a conference. WiFi on trains is always a complicated issue, but I gave it a try and ran a few of the typical checks.

When non-tech-savvy people ask for a site to be indexed they usually mean, "we would like to appear in the SERPs for relevant keywords our target audience is using to identify companies such as ours". After a few minutes I realized they actually meant what they had said – their website wasn't in any search engine index.

Incredible – I hadn’t seen anything similar in many years.

There were plenty of healthy looking html links pointing to content rich pages, all hidden in the shade of the invisible web.

Why?

As I'm not a coder, most of the scripting in the home page header didn't mean very much to me, so I turned to my friend BrowSEO for an opinion, and I nearly couldn't believe my eyes when I saw this …

The mumbo jumbo. Home page text and links were unreadable code.

I sent this screenshot back to the webmasters with the question: "what gives here guys??!!" – within minutes they came back and asked me to run the page again, and the home page in BrowSEO was looking like this

Quite a difference.

The following day the website was showing up in the Google Index and found its way into Bing as well.

What They Had Done …

Something had gone wrong with the way they applied file compression. I don't know exactly which technique was used or how it was applied, but the end result is what you see in the first screenshot of this post. Clearly the implementation was at fault, because file compression, done correctly, is not a problem.
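I can't reproduce what that agency did, but as a general sanity check you can compare what a client gets with and without gzip negotiation. Here is a minimal Python sketch (placeholder URL) of that idea: if the decoded, compressed response is readable HTML of roughly the same length as the plain one, compression isn't the thing hiding your content.

```python
# A minimal sanity check (not the agency's actual setup): confirm that a page's
# text survives compression, i.e. that a gzip-encoded response decodes to the
# same readable HTML an uncompressed request returns.
# Requires: pip install requests
import requests

url = "https://example.com/"  # placeholder URL

plain = requests.get(url, headers={"Accept-Encoding": "identity"}, timeout=10)
gzipped = requests.get(url, headers={"Accept-Encoding": "gzip"}, timeout=10)

print("Uncompressed length:", len(plain.text))
print("Encoding served:    ", gzipped.headers.get("Content-Encoding"))
print("Decoded length:     ", len(gzipped.text))  # requests decodes gzip for you
print("Readable HTML?      ", gzipped.text.lstrip()[:100].lower().startswith(("<!doctype", "<html")))
```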

Your Takeaway

Do your basics first. Clean up your styles and JavaScript, test your site for speed, and only then proceed to further implementations to increase performance. Always test and check what you're doing. You never know when you're going to make your next mistake.

Posted on by Sante J. Achille | Posted in Opinion


Posted on by Terry Van Horne


Browseo is a wonderful tool for browsing pages through the eyes of a search engine. When it comes to analyzing site structure and other factors that pertain to an entire website, Screaming Frog is the tool of choice for many to use alongside Browseo. The two tools are a perfect match.

Besides Browseo, Screaming Frog is one of the most powerful tools a technical SEO has in the toolbox. That said, the recent additions to the software have made it a "must have" for every SEO. Out of necessity, I built a similar spider back in the '90s, so I know how hard it is to build a spider to evaluate site architecture and structure. The new features take "the Frog" from being a mainstay in the SEO toolbox to a tool that webmasters should be looking at to monitor site health and stay ahead of what can become very large numbers, very fast, in Google Webmaster Tools.

The Sitemap Generator

Yeah! I know you're thinking, "big deal, there are a bunch of free tools that do that very well!" So webmasters generate a sitemap and adjust it as content is added. Yes, that ensures indexing, but does it enable you to identify new or changed content? Likely not. Screaming Frog, on the other hand, includes the last-modified date from the header, and from that you can learn:

  1. what is new on the site
  2. what has been changed
  3. when these events occurred

Last modified is a very important value in the header, as Google surely uses it in part to determine what is fresh and when indexing refreshes should occur. Since Google has been known to favor freshness, a few SEOs and spam-meisters have made their last-modified dates dynamic. Sorry folks, Google is not a fencepost, so I would be very, very careful with that stuff. So we have this XML sitemap with last-modified dates; how do we use it to our advantage?

Simple! Open Excel, import the XML file and then sort (descending) on the last-modified field. Voila! At the top of the list are the new and changed files! I suggest indexing weekly and saving your last index for comparison. Again, when your client screws up and potentially tanks their rankings by, say... I don't know... setting the "discourage search engines..." option in WordPress... Don't laugh, I've seen it happen more than once! With weekly indexing you may even be able to catch it before the search engines do.
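If you'd rather skip Excel, the same sort can be scripted. Here's a rough Python sketch (placeholder sitemap URL) that parses a sitemap.xml and lists URLs newest-first by their lastmod values.

```python
# Sketch: sort a sitemap's URLs by <lastmod>, newest first, to spot fresh/changed pages.
# Requires: pip install requests
import requests
import xml.etree.ElementTree as ET

sitemap_url = "https://example.com/sitemap.xml"  # placeholder URL
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)

entries = []
for url_el in root.findall("sm:url", ns):
    loc = url_el.findtext("sm:loc", default="", namespaces=ns)
    lastmod = url_el.findtext("sm:lastmod", default="", namespaces=ns)
    entries.append((lastmod, loc))

# ISO 8601 lastmod values sort correctly as plain strings
for lastmod, loc in sorted(entries, reverse=True):
    print(lastmod or "(no lastmod)", loc)
```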

Lists

This feature is amazing for testing how spiders react! Testing robots.txt and .htaccess is now as simple as adding the list of URLs you want to test into Screaming Frog! The first thing I do for site migrations or redevelopments is index the site just before the migration/changes take place. That way I have lists to work from for my robots.txt and .htaccess files. Once those are written and the development/migration is complete I use the list feature in Screaming Frog to test the links, robots.txt and redirections.
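A scripted version of that kind of list check is easy to sketch too. The standard-library robotparser plus requests can tell you, for a list of URLs, what a Googlebot-like crawler is allowed to fetch and where each URL ends up. Placeholder URLs below; it's a sanity check, not a substitute for the Frog.

```python
# Sketch: test a list of URLs against robots.txt and report where each one resolves.
# Requires: pip install requests (robotparser is in the standard library)
import requests
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://example.com/robots.txt")  # placeholder
robots.read()

urls_to_test = [  # placeholder list, e.g. exported before a migration
    "https://example.com/old-page/",
    "https://example.com/category/widgets/",
]

for url in urls_to_test:
    allowed = robots.can_fetch("Googlebot", url)
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = " -> ".join(str(r.status_code) for r in resp.history + [resp])
    print(f"{url}\n  robots.txt allows Googlebot: {allowed}\n  status chain: {hops}")
```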

Speed

This is very cool because you get an idea in real seconds, which makes sense, whereas the Google tool uses some cockamamie grading scheme. However, the Google tool is very useful for figuring out exactly which page elements are responsible for slow load speed. I like to make decisions on "real numbers", not some formula. How about you?

So, what do you do when you see a need for speed? Use Screaming Frog to identify the problematic pages, then run the pages that need improvement through the Google tool to see the best way to improve their load speed. This saves you time and enables you to make more informed decisions about which pages to change, because those decisions are based on real numbers that you can understand, and you know exactly how they affect the user experience.
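For a rough "real seconds" number of your own, you can time responses directly. A minimal sketch (placeholder URLs) using requests' elapsed timer looks like this, though it only measures the HTML response, not images, scripts or rendering.

```python
# Sketch: measure raw HTML response time in real seconds for a handful of pages.
# Note: this times the HTTP response only, not the fully rendered page.
# Requires: pip install requests
import requests

pages = [  # placeholder URLs
    "https://example.com/",
    "https://example.com/slow-category-page/",
]

results = []
for url in pages:
    resp = requests.get(url, timeout=30)
    results.append((resp.elapsed.total_seconds(), resp.status_code, url))

# Slowest first: these are the candidates to run through Google's tool next
for seconds, status, url in sorted(results, reverse=True):
    print(f"{seconds:6.2f}s  [{status}]  {url}")
```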

Conclusion

There are lots of other new features in Screaming Frog. I strongly suggest you learn what data it collects, and as you go through it the ways it can benefit you will become endless. That, and the never-ending additions to this program, like video and media sitemaps, mean this will be a tool you'll rely on for many years, so learning what it does and how will make you that much better as an SEO or webmaster!

Posted on by Terry Van Horne | Posted in Opinion


Posted on by David Harry


If there is an underrated tool out there that a lot of SEO folks don't seem to fully utilize, it's Google Webmaster Tools. Not only do its waters of knowledge seem murky and grey to most, the value it contains falls into the abyss. There have even been occasions where those I've worked with asked, inquisitively, why access was needed for something as simple as a website audit. Seemingly I live in a parallel universe.

But enough of that. Let it just stand that you should spend some intimate moments in there, as it's a core element of health awareness for your site. For now, I just wanted to share a recent Google video that might help entice you deeper into the rest of the data.

Using Search Queries to improve your site

The vid deals with the 'Search Queries' report (found in the 'Traffic' menu). But what makes it truly valuable is the strategic considerations that emerge. Indeed, it even enlightened me to a few things not currently in my bag of tricks.

Video highlights

Below are some notes and points that came across from the vid. Revisit them often...

General notes;

  • Impressions only count if the listing was actually shown to a user. If they didn't go to page 2 (where yer listing was) it won't count.
  • Avg position doesn't count multiple listings on a single SERP. Only the FIRST.
  • Qualified traffic = targeted traffic to terms. Can adjust the on-site to adapt where it doesn't align.
  • Sort by clicks, not impressions; gives a better sense of those actually reaching the site.
  • Look for qualified and unqualified traffic
  • Look at 'pages' to ensure the right pages come up for the terms (page mapping)
  • If you have duplicate issues, use a 301 or rel=canonical
  • Look at CTR to ensure optimal. If not, look at the SERP display to potentially improve.

Understanding the audience;

  • What are the goals of the site/business?
  • What groups are you targeting (demographic)?
  • Where are they located?
  • What devices are they using?
  • What are their objectives?
  • Do their objectives align with your site/company goals?
  • Do their query terms match your content?

Investigating top queries

  • Are these the queries I'd expect to see?
  • Does it seem like the clicks would be qualified traffic?
  • Can the display of my page in the SERP be better optimized for this query?

Investigating categorized queries

  • What is the searcher trying to do?
  • Where is the searcher located? Will their device change behaviour?
  • Is the SERP display for the page compelling to click?
  • If the searcher selects the page in the SERP, will the page match their expectations?
  • Is the site providing a good UX and will they become repeat visitors/customers and/or recommend?

Top pages sorted by impressions

  • Google likely considers these to be valuable pages on your site
  • If relevant for the user, linking top pages to your high quality, but lower ranking pages may help increase their visibility.

Optimizing top pages

  • Try to accept that your top pages for Google searchers and Google might not be what you originally imagined. Work with it (aka page mapping)
  • Check that top pages are user-friendly and conversion friendly (user metrics and potential links)
  • Consider using top pages to internally link to your high quality, but lower ranking pages. (internal link ratios)

The hidden link

One of the more interesting take-aways for the budding search geek is that she mentions internal linking practices a few times.

Consider using top pages to internally link to your high quality, but lower ranking pages.

The concepts relating to 'internal link ratios' are rarely talked about in SEO circles, which is a shame. The concept can be simplified;

To a search engine internal links can be much like their external cousins. The webmaster will highlight, through internal links, the most important pages on the website. Or simplified;

  • You link to your most important pages the most
  • You link to your least important pages the least.

But is this always the case? In my experience as a consultant, and one who believes huge value lies ON THE SITE, it's rarely given much consideration. Poor execution can lead to page mapping and ranking issues and much more.
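To see whether your own linking matches the theory, a rough audit script helps. This sketch (placeholder domain, requests plus BeautifulSoup, deliberately limited to a small crawl) counts how many internal links point at each URL, so you can compare the tally against the pages you actually care about.

```python
# Sketch: small internal-link audit. Crawl a few pages of one site and count
# how many internal links point at each URL. Placeholder domain; depth-limited.
# Requires: pip install requests beautifulsoup4
from collections import Counter
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

start = "https://example.com/"  # placeholder
domain = urlparse(start).netloc
max_pages = 50                  # keep the sketch polite and small

inlinks = Counter()
to_crawl, seen = [start], set()

while to_crawl and len(seen) < max_pages:
    page = to_crawl.pop(0)
    if page in seen:
        continue
    seen.add(page)
    try:
        soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    for a in soup.find_all("a", href=True):
        target = urljoin(page, a["href"]).split("#")[0]
        if urlparse(target).netloc == domain:
            inlinks[target] += 1
            if target not in seen:
                to_crawl.append(target)

print("Most internally linked pages:")
for url, count in inlinks.most_common(10):
    print(f"{count:4d}  {url}")
```

If the top of that list isn't the set of pages you most want to rank, your internal link ratios are working against you.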

But that's another report in Google Webmaster Tools. We'll leave that for another day. The ultimate goal for now is that those manic optimizers out there take a second, or deeper, look into what's available. Stop searching for the latest link building article. Stop finding new ways to spam the social web. Search optimization starts at home; on the website.

Until next time; play safe.

Posted on by David Harry | Posted in How to, Opinion


Posted on by Jonathan Schikowski


Here we go:

A while ago I created a short tour of browseo.net and I just realized that I haven't embedded it on the site yet. It should give you a pretty good overview of what the tool is capable of and how it works. There have been a few additions to the feature list since this video was made, like the multiplier and our cloaking detector, but all the basic functionality is covered.

Posted on by Jonathan Schikowski | Posted in How to


Posted on by Sante J. Achille


Surfing the web today is pretty much like watching TV or listening to the radio. It has become part of our everyday life. No matter where you are or what you are doing, there will be someone on your path with a cell phone or a tablet, streaming video or music, reading a book or a newspaper ...

The success of a technology is directly related to how "invisible" it is

The web wouldn't be what it is if we still had Gopher to find documents or WAIS to query databases ... just too complicated for the average Joe and Jane.

Technology is there and is shaping the way we live - a relentless change with sweeping permanent effects on how we communicate and organize our lives.

Eating entices the appetite

Our appetite for information drives us to search, and we have great expectations of the search engines to deliver just what we want in a fraction of a second by choosing from billions of documents - to call this a challenge is an understatement, to say the least.

The burden is on the search engines to meet our expectations and come up with the "right" solution to our problems - but how good are we at offering our merchandise (= information) to these marvels of technology?

Image © dedMazay - Fotolia.com

Our search engines are crawlers: they retrieve, store and index web pages, using a very large number of "signals" to evaluate their quality, but one essential dimension of this evaluation is totally missing: sentiment. They don't see our pages, they understand them - and that's quite a difference.

If the Internet is the World Wide Web, search engines are the Magnificent Mysterious Moles, not spiders - another breed of animal. Moles live underground in tunnels they burrow themselves: as they spend most of their life underground, they have no need to see well. Their eyes are small, and a layer of fur and skin droops over them. The search engines carve digital tunnels into our web sites, exploring our content and analysing both server-side aspects and social interactions. It's important to "see" things their way.

Watching the Moles at work with Browseo

Browseo allows you to see a web page through the eyes of a search engine: no frills, no graphics, no emotions, no sentiment; only information, no layout. Optimising the CMS and layout is still an important piece of the equation:

  • Important content should be offered to search engines with the highest priority
    With Browseo you can focus on analyzing the pure content.
  • Navigational menus should be pushed below the fold of the page
    You can see the actual position of navigation and content from a crawler's perspective.
  • Title tags and meta descriptions require a great deal of attention: well-written, personalized tags will help rankings and boost click-through rates in the SERPs
    Check out the SERP preview to determine what a page will look like in Google's search results.
  • Server Side responses need to be checked, and should always be 200 (OK) unless you are dealing with pages which have been redirected
    Browseo lists all redirects of the URL you enter, even if there are multiple redirects in a row.
  • Visitors and search engines need to see exactly the same content
    Use the fraud detection feature to determine if this is the case. Browseo even shows you the differences in detail!

These are just a few vital signals which can have a dramatic impact on your website's performance – Browseo offers a bird's-eye view of your content management strategy and will effectively identify those critical areas needing improvement in a matter of minutes.

Posted on by Sante J. Achille | Posted in Opinion



Posted on by Jonathan Schikowski


Introducing browseo.net's new de-cloaking engine

This just came out of our labs: you can now check for cloaking attempts with one click:

Browseo will then perform two checks:

  1. is the response code different for Googlebot?
  2. is any part of the code / content different for Googlebot?

Cloaking means serving different content to search engines and human visitors (browsers). Cloaking is considered a violation of Google's webmaster guidelines, and it probably hurts the feelings of every other search engine out there as well.
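The detector's internals aren't public, but the idea behind the two checks can be sketched in a few lines of Python: fetch the same placeholder URL with a normal browser User-Agent and with a Googlebot User-Agent, then compare status codes and bodies. A real check would also verify crawler identity by IP, which this sketch does not.

```python
# Sketch of the idea behind the two checks (not Browseo's actual implementation):
# fetch the same URL as a "browser" and as "Googlebot" and compare the results.
# Real cloaking detection is more involved (e.g. reverse-DNS of crawler IPs).
# Requires: pip install requests
import requests

url = "https://example.com/"  # placeholder URL

browser_ua = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
googlebot_ua = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}

as_browser = requests.get(url, headers=browser_ua, timeout=10)
as_googlebot = requests.get(url, headers=googlebot_ua, timeout=10)

print("1. Response code differs: ", as_browser.status_code != as_googlebot.status_code,
      f"({as_browser.status_code} vs {as_googlebot.status_code})")
print("2. Content differs:       ", as_browser.text != as_googlebot.text,
      f"({len(as_browser.text)} vs {len(as_googlebot.text)} chars)")
```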

We included a link instead of displaying the results inline because this would have slowed down Browseo's performance a bit. Since the de-cloaker is kind of a special interest tool we felt it would be better to keep Browseo speedy and just link to a separate page with the cloaking detector.

Please bear in mind that this feature is very much a Beta, and that feedback is always welcome 😉

 

 

Posted on by Jonathan Schikowski | Posted in How to, News


Posted on by Jonathan Schikowski


You can now download the data of an entire browsing session in Excel (.xls) file format.

Once you open the Takeout in Excel, you'll have separate columns for URLs, responses / redirects, text information and one column for the <head> content. We are planning to separate the latter into separate columns or split up the data in some other way, so if you have any ideas feel free to let us know.
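If you prefer to work with the export programmatically, a short pandas sketch will do. The filename and column names below are assumptions; check what your own export actually contains before filtering on anything.

```python
# Sketch: load a Browseo session export into pandas for sorting and filtering.
# "session.xls" is a placeholder filename, and the exact column names depend on
# the export, so inspect them first and adjust.
# Requires: pip install pandas xlrd
import pandas as pd

df = pd.read_excel("session.xls")

print(df.columns.tolist())  # inspect the actual column names first
print(df.head())

# Example: once you know the response column's name, filter for redirects
# df[df["Response"].astype(str).str.startswith("3")]
```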

A session is not limited to browsing a single website, but includes every URL you analyze. Once you close your browser the session has ended and you'll start with a fresh session next time you use Browseo.

Enjoy!

- Jonathan

PS

Pro tip: use the Multiplier to open dozens of URLs in separate tabs simultaneously. Your .xls file will contain all those URLs (currently, this works best in Firefox).

Posted on by Jonathan Schikowski | Posted in How to



Posted on by Jonathan Schikowski


In our first round of gathering feedback earlier this year, several professional users asked for the ability to open multiple tabs at once. Well, I'm glad to announce that now you can.

Check out the Multiplier!

Copy a list of URLs you want to analyze from your spreadsheet or any other source, paste it into the Multiplier and push the "create links" button. This generates a Browseo link for each URL you entered, which will open in a new browser tab (or window, depending on your browser's settings) when clicked.

Bonus: if you select the "Open tabs" checkbox before creating the links we'll attempt to open all generated URLs in separate tabs for you. Whether or not this works for you again depends on your browser's settings (you might have to allow popups).

I just wanted to get this out to all of you quickly. We released more new features during the past few days, and I'll tell you about those in a separate post. Stay tuned 😉

- Jonathan

Oh, have you tried our free new Android app? We're getting lots of positive feedback for this first public release. I'd love to hear what you think!

Posted on by Jonathan Schikowski | Posted in How to


Posted on by Jonathan Schikowski


With our aim of providing an easy-to-use tool focused on what truly matters, we decided to provide context for some of the data that you see in Browseo. I've been putting this off for a while and deliberately decided against including it in the initial releases because I was afraid of making generic recommendations. Seasoned SEOs know what they are doing, and they don't need me to tell them why redirect chains might need to get fixed, or what place headings have on a page. In fact, depending on how you do your work and what you are working on, things can be dramatically different. And reading industry blogs, it is obvious that everyone has a different opinion about those things.

Contemplating my own reactions to recommendations and context menus in various applications, I found myself complaining about recommendations that are either too generic or too specific (and hence not addressing my problems).

However, Browseo has been used by many thousands of users, and many of you use Browseo on a regular basis. You blog, tweet and talk about Browseo. Even Bing mentions us on their Blog. We are grateful to all of you for making Browseo a standard application in your SEO tool set. And because of this, we feel very much obliged to listen to your requests. Guess what the number one request we received over the last couple of months was?

Explanations.

So I've written a first version for each item in Browseo's sidebar. I am fully aware that what I've written is not perfect, and by no means comprehensive. It may even be incorrect in some cases, depending on how you use the tool and what you're looking at. So please bear with us while we try to get closer to decent explanations and recommendations. They are meant to help new users make the most of Browseo, and to dispel doubts.

Here is what I've come up with:

Response Code

 

Ideally, you should see a 200 (OK), which means you are fine. If you need redirects, a general rule is to always use 301 redirects, and only redirect when absolutely necessary. For example, a redirect can be used to combine https://site.com and https://www.site.com into one. If you use several redirects in a chain, search engines tend to get lost.
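If you want to see a redirect chain outside of Browseo, a few lines of Python (placeholder URL) will trace every hop and its status code, which makes chained redirects obvious.

```python
# Sketch: trace a URL's redirect chain hop by hop. A healthy setup shows at
# most one 301 before the final 200; long chains are what search engines dislike.
# Requires: pip install requests
import requests

url = "http://example.com/"  # placeholder; try the non-www/http variant of a site

resp = requests.get(url, allow_redirects=True, timeout=10)

for hop in resp.history:
    print(f"{hop.status_code}  {hop.url}  ->  {hop.headers.get('Location')}")
print(f"{resp.status_code}  {resp.url}  (final)")
```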

Text Information

This section informs you about the number of words and links on the page you are browsing. Browseo displays these counts so you don't have to tally them yourself.

Head

Here you typically see several entries with varying degrees of importance. The title and description are two of the most important parts of a page, whereas the keywords tag is ignored by most search engines. The robots tag can tell you whether there are any specific directives for search engines regarding crawling and indexing the page, while tags such as "generator" or "date" are only displayed here for your convenience.

SERP Preview

This is Browseo's attempt at predicting what this page might look like in search results, based on the title and meta description tags. Please note that the actual snippet in the SERPs may look different, because search engines use several sources and also display different snippets based on the particular search query.
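The same kind of preview can be approximated yourself. The sketch below (placeholder URL) builds a rough, character-based snippet from a page's title and meta description; the 60- and 160-character limits are ballpark assumptions, since Google truncates by pixel width and rewrites snippets freely.

```python
# Sketch: a rough, character-based SERP snippet preview from a page's title and
# meta description. The length limits are approximations, not Google's rules.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder URL
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

title = (soup.title.string or "").strip() if soup.title else ""
desc_tag = soup.find("meta", attrs={"name": "description"})
description = (desc_tag.get("content") or "").strip() if desc_tag else ""

def truncate(text, limit):
    return text if len(text) <= limit else text[:limit - 1].rstrip() + "…"

print(truncate(title, 60))         # ~60 characters is a common rule of thumb
print(url)
print(truncate(description, 160))  # ~160 characters, ditto
```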

Headings

Use headings where it makes sense for users. There are six levels of headings, H1 being the most and H6 the least important. Headings are used to describe the content of a page and to catch the user's eye. Search engines look at them to find out what a page is about, just like they look at all of the text on a page.

If you would like to suggest alternative explanations, feel free to contact us here or get in touch with me on Twitter.

Posted on by Jonathan Schikowski | Posted in How to