Wednesday, October 1, 2014

Use Social Media to increase your Sales

As a small business owner, there are many ways to promote your product or brand. One of these ways is free and can have a huge reach: social media. Unfortunately, a lot of the entrepreneurs I have spoken to recently feel that the effect isn’t worth the effort. We also see that in our reviews, by the way. It doesn’t matter if we review a photographer’s website or the site of an IT agency, most seem to invest little time in social media efforts or campaigns.


It's hard to determine the ROI of a social media campaign. The tools that help with that determination are usually paid. A small business owner who isn't convinced there is any ROI at all won't make that investment. Of course, ROI depends heavily on a number of factors. How do you convince the customer to buy in a tweet, for instance? Nigel's comment on my previous post about social media got me thinking:


I also like social media but how do you target the “ready to buy” segment instead of people “Browsing”.
Nigel Abery, oaklaurel.com.au



You don't buy a hammer to drive a nail into a piece of wood, but to build a bench. Growing an audience using social media, as I mentioned in my previous post, is a means to an end. The ultimate goal of all your social media efforts is of course to sell stuff and make money. It can even be the first step in a multistep process: get more newsletter subscriptions via Twitter to sell your ebook, for instance.


Now how will you be able to trigger that social audience to purchase your products or services? I did some digging on the interwebs. There is a lot to be found on the subject, but no user manual that works for everybody. Unfortunate, but not unexpected. It's not an exact science, of course. But I've come up with some insights nevertheless :)


The obvious social media sales


Larger companies with a huge social media audience tweet or post their way to money. Now we have this new product, buy it. This will make your life easier, buy it. If you already have this product of ours, you'll want this one too. Buy it.


It's a direct trigger that works due to the large audience. If you tell thousands of people to buy something, you'll get sales. That seems obvious; it's sure to trigger at least someone.


Twitter Buy Button

As seen on Mashable, for instance



In most cases, social media efforts lead to long-term wins, like someone remembering that you sell Lego t-shirts and finding you again via his Facebook timeline. But it should be possible to get direct buys without waiting for that Buy button to become globally available to everyone.


Obviously, large brands with a huge following can become social entities of their own; small businesses almost never can. Just the other day I was talking to a local business owner about social media (Twitter). I asked him if his personal profile had more visitors than his business profile. It did. For most small businesses, the social media draw isn't the business, but the person behind it.


Small business


Where should you start, right? I think your social media efforts should be designed around your website, to be honest. If you consider social media a serious opportunity, you should make it work with your website, not next to it. If you come up with a nice idea to promote a product on social media, Twitter limits your message to 140 characters. An optimized landing page on your website doesn't have that limitation. If the landing page is for that Twitter campaign only, you could even measure the effect of the campaign without tagging your Twitter campaign in Google Analytics (or knowing what tagging a URL is in general).
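For illustration, "tagging" a campaign URL simply means appending Google Analytics UTM parameters to the landing page address. The parameter names below (utm_source, utm_medium, utm_campaign) are the standard Google Analytics ones; the domain and values are made-up examples:

  http://www.example.com/lego-tshirts/?utm_source=twitter&utm_medium=social&utm_campaign=lego-launch

A visit through a URL like this shows up in Google Analytics under that campaign, even if the landing page is reachable in other ways too.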


Come to think of it, in most small business cases social media is a lead to a sale, not the sales effort itself. I have no scientific numbers to prove it, but it seems to make sense. Sure, you can set up a shop app on your Facebook page, but that serves the same purpose as a great website: you are lowering barriers, as visitors can become customers without leaving the social network. And that is just Facebook. Pinterest, Instagram and Twitter don't have these app possibilities. Yet.


The question remains how that small business owner can use social media for immediate or future profits.


Leading them to the sale


Social media is buzz for companies and people. Social media is people talking one to many. Social media is narrowing your business communication down to a niche. People just like to browse for things they might like. Unbounce did an article on cart abandonment a year ago, stating that "56% of shoppers aren't ready to purchase but want to save their selection" in their cart for later. I think it is safe to say that this behaviour has not changed. So what we should use social media for, in this case, is to introduce the product to the customer.


Browsing for products, also on social media


A sale is often less ad hoc than it appears, even when the product is new to the customer. When there's some kind of buzz around your product, people might start to want it unconsciously. It becomes more and more top-of-mind, and the eventual sale is deliberate, even though it might seem ad hoc to the buyer at that point. This is a long-term effect of your social media efforts.


At Yoast, our main focus is SEO / UX, analytics and WordPress. Most tweets of our team are about those subjects. Creating a niche like that will give you the social following that is already interested in (one of) your products. It did for us.


It is nice to just tweet about beautiful cars when you are selling bread, but those tweets won’t make you money. Tweeting about that new paleo bread you are selling online starting today could get you (immediate) sales, though.


Start early in the process of a new product or offer ("We are releasing a brand new plugin early next week!" or "Only on sale next week, get yours!") and create scarcity ("We'll start with a test audience of 250 people."). All the basics of sales apply to social media as well. You can easily create series and repeat your offer. I tend to use Hootsuite for that. I dislike the GUI of the browser version, but like the functionality. Buffer is another great tool you could try.


Create the need or wish for your product or services. And guide the potential buyer via your preferred social media outlets to your website to close the deal.


To sum things up


Direct selling via social media is coming, and it could be an opportunity for small businesses. After reading a number of articles about it, these are my conclusions:


  • I am looking forward to reading about new products on Twitter and hitting ‘Buy’.

  • I hope Pinterest won’t implement such a Buy button soon, as that will cost me my allowance and more. Having to click a link to another website is a safe barrier.

  • My gut feeling tells me too few small businesses are aware of shop apps for Facebook, like the Shopify app. Dig into that, especially when you have a local following on Facebook.

Current social media sales efforts should trigger a niche-specific sale on your optimized landing page. We should write a post about optimizing that landing page. Bet you'll be coming back for that!



SEO Is No One-Trick Pony



This past week, I was joined by Josh McCoy of Vizion Interactive in presenting an SEO workshop for attendees of the Integrated Marketing Summit in Kansas City. The workshop was four hours and the presentation totaled just more than 100 slides.


As you might imagine, there was a lot of stuff to talk about. By its nature, some of that content was a “bit” on the technical side, but we tried our best to speak “English” so that the attendees could walk away with fewer questions than they had coming in.


We wanted the workshop to be interactive, and welcomed questions. One question did strike me as something that I think is bantered about too often in executive meetings throughout the world…


The comment (and question), as best as I can recall was something like, “You’ve covered a lot of technical stuff in this presentation, but can you just tell me what the one thing is that we can do to really improve our results for SEO?”


We didn’t dodge the question. I mean, if you had to pick one tactic, I would have to pick “create content.” But, that’s really too simplistic an answer.


I shared with this individual that sometimes content isn’t the answer. Each and every project is unique, the competitive set is unique, and every website (company) has its own set of unique challenges. I shared a few examples of instances where I had worked with large organizations that simply had an issue with getting content indexed. Once this “one thing” was fixed, it was a hockey stick. Traffic, in some cases, doubled. These companies already had authority built into their site (solid link profile/larger brands, etc.). And, in some cases, that “one thing” was the fact that their title tags were absolutely horrible (yes, there are still some with the title tag of “home” on their home/index page of their sites).


There are currently more than 1 million results for an “allintitle:” search on Google for “homepage” and nearly that many for the same search for “home page.”


But, these “one thing” opportunities don’t come around very often.


More often, you are engaging in an omni-channel approach: building authority, strategically developing content, handling the technical stuff, and optimizing conversion rates as much as title tags.


This, in my opinion, is “today’s SEO.”


For many of you, this is not news. But, what became clear to me this past week is that there are still many who think of SEO as a "quick fix" or consider it a "do this one thing, and you've done SEO." I've certainly read my fair share of wonderful columns detailing individual tactics that are involved in the SEO process, but I don't know that I've seen one which tried to hit upon the various things that go into an SEO engagement within the confines of one article. I will attempt to hit upon most of these today.


Today’s SEO


A common expression in recent years has been that "if it's digital, it's optimizable." If we're really doing this well, our recipe for SEO success involves a lot of ingredients. Here are a few that jump out to me:


Technical SEO


Once you have content, you need to make sure that it's indexed. Developing a sound URL structure is foundational to this effort (you would rather have a URL such as www.sitename.com/products/name-of-product than www.mccoysbikeshop.com/Products.aspx?Categoryid=94&Productid=72 – example pulled from Josh McCoy's post from two years ago on proper URL structure). Alongside this, you'll want to develop and submit an XML sitemap and work on internal linking.
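To make the sitemap part concrete, here is a minimal sketch in the sitemaps.org XML format (the URL echoes the example above; the date is a placeholder):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.sitename.com/products/name-of-product</loc>
      <lastmod>2014-10-01</lastmod>
    </url>
  </urlset>

You then submit the sitemap's location through Google Webmaster Tools (and/or reference it in robots.txt) so the engines know where to find it.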


Content


The basis of SEO is that you have quality content that "speaks to" everything that you do. Finding the keywords/themes of this content is the art/science of SEO (you want to target keywords that have search volume, are relevant, AND that you stand a fair chance of ranking for; see Competitive Analysis, later). Once you have developed a list of targeted keywords, you must determine how these keywords match up with content that you currently have on your website (or other properties – see Social Media Marketing) and what content you may need to develop (product/service page content/video/blog, etc.). When you're doing content correctly, you are creating original, high-quality content that speaks to your intended audience in the right manner, so that they might engage with it (and your company) and possibly share it, so that you can work toward earning links.


Competitive Analysis


As stated previously, everything in SEO is relative to the competitive set. You can spend a lot of time in this area of practice (and I suggest that this isn't merely a "one-and-done" affair, either). At its core, the competitive analysis is about determining the opportunities that exist to do well in SEO (is there light at the end of the ROI tunnel?) and what you might need to do to be successful. My favorite tool remains SEMrush.com for a quick analysis of the opportunity. Enter in some competitor domains and see how much traction they have in Google (SEMrush provides a "SE Traffic Price" metric, showing what it might cost – in AdWords spend – to get what these guys are otherwise getting "for free" via non-paid search). From there, you might also want to do some site: searches in Google (e.g. site:competitor.com) to see how much content the competitors have indexed (and what kinds of content), to gain a sense for what you may need to build. You can also use any number of link research tools (Open Site Explorer, Majestic SEO and Ahrefs, just to name a few). Put all of this together in a spreadsheet and analyze the opportunity (and the opportunity cost/work) to get a sense for what the project is going to look like. A good illustration of the process was developed by Aleyda Solis on Moz, but there are certainly more components that go into an in-depth competitive analysis. Some of those are outlined by Boris Demaria on WooRank.


Analytics


To me, it is impossible to claim that you are an expert at SEO if you aren’t deep into analytics. At the end of the day, we are not optimizing for “rankings.” We are optimizing towards quality traffic increases and an increase in conversions/money.


Usability/Conversion Rate Optimization


In my opinion, an SEO engagement is all about “optimizing for results.” At the end of the day, if I’m the customer, I want money. I want ROI (more money back than I pay in). If it so happens that my “SEO company” happens to spend considerable time in usability/conversion rate optimization, then so be it. A 25 percent lift in conversions/sales is perhaps more important than a 25 percent lift in traffic (and certainly a hell of a lot more important than a general rankings increase). You can certainly use your analytics platform to identify where visitors are falling off and, in some cases, you can simply eye-ball test some things that are obviously wrong with the usability of a website. That aside, I do really like what Lucky Orange is doing with its real-time analytics product. You can see how people are navigating the site, in real time (as well as view recorded visits), gain insight on what the experience looks like in different browsers/platforms, get insight via heat maps for mouse movements, clicks, and scroll depth, and a whole lot more.


PR/SEO


For as long as I can recall, we have recognized the similarities between PR and SEO. PR "back in the day" may have strictly referred to "submit press releases to gain links," but that has certainly not been the case for quite some time. PR is a way of amplifying your message. It's outreach to journalists/influencers. It's "promotion." There's a lot of great reading out there about how to synergize efforts. One such case study is this piece by Robin Swire on Moz.com. One very common practice for us is to set up alerts for our clients using HARO, to identify opportunities to contribute to pieces that are being published (folks seeking an expert opinion/contribution to an article that is being written). This is great for agencies whose clients are unwilling (unable) to commit the time necessary to write compelling "thought leadership pieces," but may have time to contribute a few paragraphs. Often, these contributions will result in a link back to your site. Even without a link, I have to think that Google is smart enough to pass some value through (as was hinted at last year, in Google's John Mueller's Webmaster Central hangout).


PPC


In a perfect world, you have enough money (and time) to support both SEO and PPC efforts. And, in a perfect world, there is a PPC budget that can be established/maintained for keyword research purposes. With PPC, you can obviously buy your way into position to gain traffic for specific keywords, and test them, so that you can determine if these keywords are worthy of being a part of the SEO effort.


Social Media Marketing


Much like PR, social media marketing is about amplifying content and trying to earn links, buzz, social shares, and brand equity. The core of the effort may begin with hosting/maintaining a blog (with good/researched/resourceful content). Developing a blog is not something to take lightly. Please do NOT do this if you do not intend to maintain quality content on a regular basis. How often should you post? Every situation is unique, but I would say that if you don't intend to update the blog AT LEAST once per week, then perhaps you should consider your options. Step one is determining how to structure your blog. There are reasons why you might consider a sub-domain versus a sub-directory versus a separate domain (blog.yoursite.com versus yoursite.com/blog versus yoursiteblog.com). From there, you will want to create an editorial calendar that helps to shape your content initiatives.

That said, some of the best posts are those which – quickly – get posted on "hot topics" and are shared immediately. If you are an early source on something, there is a better chance that you will earn links. Creating the content is one thing…promotion of that content is another. This is where PR and social media promotion come in. You must get the right eyeballs on your content. If it's engaging enough, folks will share it. If folks share it, you stand to earn a few links.


There are certainly many other elements that can go into a “full service” SEO engagement (I haven’t even talked about local SEO, video SEO, image SEO, mobile, or a number of other things), but I hope that this has helped to shape the discussion of “what SEO is,” and helps others to understand that it’s not – usually – any “one thing.”






A New Click Through Rate Study For Google Organic Results





Advanced Web Ranking has released a study showing fresh data on the click-through rate from Google's organic search results. The data was taken from Google Webmaster Tools Search Queries reports from large accounts back in July 2014.


On average, 71.33% of searches resulted in a page-one Google organic click. Pages two and three get only 5.59% of the clicks. On the first page alone, the first five results account for 67.60% of all the clicks, and results 6 to 10 account for only 3.73% (note that 67.60% plus 3.73% adds up to the 71.33% page-one total).




The study also includes a chart breaking down the click-through rate by exact position.


The study was first presented at SMX East yesterday by Philip Petrescu of Caphyon and then posted on Moz.


The full details of the study break down desktop versus mobile click through rates, branded versus non branded search queries and more. You can download the full study as a PDF over here.




Can Google Determine the Level of Quality in Your Content?

Can Google Determine the Level of Quality in Your Content? is a post by SEO expert Andy Eliason. For information about our SEO services or more great SEO tips and tricks, visit the SEO.com blog.



In the recently released Searchmetrics Ranking Factors Study, the case for quality content is once again highlighted as a critical SEO component. There are probably any number of people out there rolling their eyes right about now, thinking: not another post about how "content is king." We've heard it before. How many ways can you keep saying the same thing?


Well, that's what I usually think, anyway, so I was a little surprised to see it presented as something that is "becoming increasingly important" in this report. They say "this was not the case for a long time," which I found interesting. Quality content is not becoming important; it has always been important. Right?


But then I got into the report a little further and really started to see what they meant.


From a strictly SEO point of view, content was always seen as necessary, but it usually took a back seat to other, more technical parts of the craft. Why?


Well, we say a lot about high-quality content, but who is really there to judge?


Content was a lot easier in the days of keyword density and strategic keyword placement. In those days, quality was how you managed to use the most unnatural long-tail keywords in the most natural ways possible, and hoped that no one noticed that real people didn't actually speak like that. Or maybe you could just bold your keywords, and surely that helped the quality score shoot right up. (/sarcasm)


That kind of behavior, of course, is something best left in the past. After the release of the Hummingbird update, Google began to focus even more on semantic and context-based queries.


Do Context and Relevance Equal Quality?


One of the simplest ways to define “quality,” at least from a search engine’s perspective, is by determining the context and relevance of the content. In the past, this was a simple matter of using the right keywords in the right places. We’ve moved on from that level, though, and taken a more holistic approach.


Right now, targeting single keywords – or even keyword groups – simply isn’t enough to be effective in the modern online environment. Ever since the Hummingbird update, Google has been developing a more semantic approach to search, and that means they’re looking for semantically relevant terms (the report refers to them as “proof terms”) and other relevant terms that will speak to the overall value and relevance of the content.


This kind of "semantically comprehensive wording" certainly acts as a signal that the page is relevant to a query, but consciously selecting these terms and phrases is going to be more difficult than just going through the standard keyword research. On the other hand, this should lead to more natural content creation, because if you really are generating valuable content, it should happen naturally.


The Best Part of a Semantic Focus


The benefit of this switch is that now, as writers focus on a more holistic approach, they should be able to reflect more topics in their text. This, then, makes the same page relevant for users with a varying range of search intentions. The same copy can start to rank even better for related, additional keywords without even trying. (Well, obviously, with a lot of trying, but you know what I mean.)


So, according to Searchmetrics: “If website editors want their content to rank better for specific keywords, the content should be created with the fulfillment of user search intent in mind.”


What does that mean, exactly?


It means that what we’ve been saying all along still carries a lot of weight. We always say that you should write for the user, and not the search engines. By focusing on their actual needs, you can provide the kind of quality content Google is looking for.


Is Quality about Readability?


Does your personal writing style figure into the overall quality of the content? Are you using words and phrases that are too complex for your audience just to try and sound smart? This year, Searchmetrics included a new development in its report, and that’s the legibility of the text.


It seems that the general trend is that text that is easier to read tends to rank higher. There's even a suggestion of a mathematical formula to determine the level of legibility (get the report for yourself to check it out), but it's unlikely that Google is using something like this to determine how well you write.


Rather, Google is equating readability with “easy to comprehend,” and so it’s probably looking at user signals, like time on the site and bounce rate to judge whether or not your users find your writing legible.


It’s interesting to note, then, that by using those signals, Google isn’t necessarily looking at quality but usability. Technically, this could mean that layout is just as important as what you say. (And when we get into rich media’s importance later, we’ll see that’s definitely a thing.)


Does More Content Equal Quality Content?


This year also saw a lot of increases in correlation to content length. The report said that: “This means that websites need to produce more content in order to remain competitive in search.”


So, that doesn’t seem to mean you need to write longer content, but just have more of it. There is a difference.


Keep in mind, though, more doesn't automatically mean better. You still have to consider legibility and keyword/topic usage. You need to balance the amount of content with the quality features that signal your relevance. Having said that, though, it does seem that sites with more words in the copy hold onto higher ranking positions.


So ask yourself: is this another holistic thing? Is it about the site word count, or is this about the word count by page?


Rich Media Matters


Images and videos can always make content more appealing. They help increase the time on site and reduce bounce rate, which means they are an important factor in a definition of quality.


Images are all about style, though, so you can expect that this will only go so far. That is, you're not going to get more value out of relying on images alone. Right now, though, you're better off leaning toward image-rich content.


So What is Quality to a Search Engine?


In the end, focusing on a single keyword isn’t really enough to show that your content is relevant and filled with quality signals. You need to look at topics and related terms. You need to see the site as a holistic thing. This will help you rank better for a number of related terms and establish your position in the top of the rankings.



How I Doubled Traffic To Over 200,000 Organic Visitors Per Month, Overnight



This is a case study on how I doubled organic traffic to an existing website from 100,000 visitors per month to over 200,000 visitors per month, overnight, using technical SEO and the osmosis technique.


This was done using a strategy that involves absorbing another operating website’s content, rankings, and ultimately traffic.


Here’s the website’s traffic the week before, from July 25, 2014 to July 31, 2014:





And here’s the same website’s traffic the week after, from August 1, 2014 to August 7, 2014:





And here’s what the lift looks like:





Here’s the website’s traffic for the month of June 2014 (prior to absorbing the other site):





And here’s traffic for August 2014 (with the osmosis taking place on 8/1/14):





So What Is SEO By Osmosis?


Just like in science, where one membrane absorbs the properties and molecules of another membrane during osmosis, this SEO strategy involves absorbing the content and rankings of another website so as to inherit its organic traffic.


So how do you do this?


The concept itself is actually very simple, but the implementation needs to be spot on or you could torch both websites, at least temporarily.


The process involves using one of two methods to essentially swallow the content of the other website, and then inherit its organic rankings and thus its traffic.


How To Absorb All The Traffic


As mentioned earlier, there are two ways to do this:


  1. Using 301 re-directs, or

  2. Using cross-site canonical tags

Both involve the same process of preparing for the cut-over; the content needs to be duplicated on the destination website by being uploaded to the destination site’s database.


If you're going to use 301s, then it's best to replicate the URL structure of the old site's pages and, if possible, place these all in one new directory, so you only have to write one mod_rewrite rule to repoint all of these URLs to their new home.


In the example I use for this post, I used a sitewide 301 redirect to a sub-directory on the destination website.
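As a rough sketch of what that looks like on an Apache server (the domain names here are placeholders, not the actual sites involved), one mod_rewrite rule in the old domain's .htaccess can repoint everything:

  RewriteEngine On
  # 301-redirect every path on the old domain to the same path
  # under /old-site/ on the destination domain
  RewriteRule ^(.*)$ http://www.destination-site.com/old-site/$1 [R=301,L]

The $1 back-reference preserves the original path, which is why replicating the old URL structure on the destination site makes the cut-over so much simpler.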


If you are going to use cross-site canonical tags the process has a few extra steps:


  • The canonical tag needs to be added into the head of every page that you will be absorbing (see the sketch after this list).

  • Generate a fresh sitemap to make sure all pages are included.

  • Brute-force submit this sitemap through Webmaster Tools 3+ times each day to force Google to crawl and index those canonical tags.

  • Keep an eye on the URLs ranking for those pages and, once you see 75%+ of them switched over to the new site, set the old domain to return a 403 Forbidden error.
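The canonical tag itself is a single line in the head of each old page, pointing at its duplicate on the destination site (the URLs are placeholders again):

  <link rel="canonical" href="http://www.destination-site.com/old-site/some-article/" />

Once Google crawls and honors the tag, it treats the destination URL as the authoritative copy of that content, which is what transfers the rankings.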

What Makes A Good Osmosis Website


Finding a website that is a good fit for this strategy depends on a number of factors, for example:


  • The website needs to already have sufficient natural rankings and a sustainable base of organic traffic, although sufficient is a relative term here and will vary depending on your comfort level for risk and investment.

  • The website should have content that is at least tangential or in some way related to that of the target destination website.

  • Ideally the website will be built on a stack using the same kind of database. For example, if I were going to do this for *this* website, I would look for a site running a MySQL database (even more ideal would be if it was WordPress), so I could download the database and then simply upload it to mine.

Finding websites at this scale (~100,000+ visits per month) available for acquisition is not easy; it's much more likely that you will be able to find sites in the 10-30k visits per month range that you can pick up for a few thousand dollars, priced based on:


  • Age of the domain

  • Authority of domain (PageRank / DA)

  • Number of unique visitors

  • Monetization and revenue

Is This Traffic Sustainable?


You tell me. Here are the traffic numbers for September 2014:





Something To Be Aware Of…


It seems there is a somewhat rampant problem with Google continuing to index old domains that have been 301'd. (A few examples were shared as embedded tweets.)

However, the meta attributes – i.e. page titles and meta descriptions – as well as the URL targets (destinations) all point to the correct version and bring click-throughs to the right place. So in terms of traffic this still works, but in terms of continuity of user experience, it seems way off.


With that said, in my experience this problem is less significant when you go the cross-site canonical route to absorb the traffic. Thank you to Ross Hudgens for providing me with Rishi's tweet and examples.


For Future Updates


I plan to continue to acquire more sites and fold them into this same site, to see if I can grow through traffic acquisition to over 1,000,000 visitors per month. To stay tuned on my progress – including the sites I'm looking at buying and details on traffic and bids/offers that I won't share here on the blog – join my mailing list (it's free).


Lastly, if you enjoyed this post – and especially if you didn't – please consider taking a moment to leave a comment. I use comments as my number one indicator of how well a post topic or concept resonates with readers.


Thank you.



Searchmetrics SEO Visibility Gets Fully Integrated in cognitiveSEO

Some say we are workaholics; some say we are perfectionists. What we say is that we do our best to provide our customers with the best experience they can get from an SEO platform. This fall comes with a partnership we are proud to introduce to you today.


The cognitiveSEO + Searchmetrics integration provides unparalleled insight for any SEO Professional.



In the following lines we are going to present the importance of this integration and the benefits that you will get from it.


Searchmetrics cognitiveSEO Integration


What Is SEO Visibility and Why Is It Important?


First of all, you need to know that cognitiveSEO is the first well-known SEO tool (and the only one at this moment) to have this sort of integration, combined with all the other SEO-related data crawled on demand for your site and competitors. In other words, there is no other place where you can get this variety of information for your site or for your competitors'.


What is SEO visibility, you might ask, and why is it so important?


SEO visibility is essentially about monitoring the search performance of a website: it presents the historical development of a domain's visibility in Google. It has two components: search volume and the position of the ranking keywords. It reflects how often a website shows up in the search results and helps you track the "winners" and "losers" among the tracked keywords. SEO visibility gives you a great picture of the market environment and helps you easily identify market trends and digital marketing strategies. The score is calculated from millions of keywords tracked in Google, weighted by their importance and traffic volume: each keyword has a particular importance and, based on that and the ranking of the tracked site, an SEO visibility score is calculated. This score is updated weekly.
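As a simplified illustration only (Searchmetrics' exact formula is proprietary), you can think of the score as a weighted sum over all the keywords a domain ranks for:

  SEO Visibility ≈ Σ (keyword search volume × weight of the ranking position)

where a position-1 ranking carries a far higher weight than, say, a position-20 one – so both losing keywords and slipping down the results pull the score down.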


This new integration is available for all clients, regardless of the subscription they have and also on all the campaigns they are working on, whether they are One time snapshots or Recurring Campaigns.



This being said, allow me to showcase the main advantages you will enjoy from now on:


What Can You Do with the SEO Visibility?


1. Identify Historic Google Penalties for Any Site


When identifying a penalty, you should always check all possible hypotheses, doing an in-depth analysis and looking at the situation from all angles. Getting distracted by the obvious might lead you to the wrong conclusion. At cognitiveSEO we try to make the penalty identification process as easy as possible, because we know how important this matter is for our users.


Easily Spot Historic Google Penalties using the SEO Visibility Metric


With the integration of the SEO visibility chart, we take the "penalty identification" process to another level. Every time you're trying to figure out what happened to a certain website, the first chart you should look at is the one showing its visibility. Just by taking a glance at this chart you can easily see whether there are significant drops in a website's history that can reflect a Google "red flag", or whether the site's online visibility is stable, without dramatic ups and downs. What is very important to mention is that you can not only see the current situation of a site but also check for penalties or boosts in the historical trend.


If you are a reader of our blog, you might be familiar with some of our case studies regarding penalized sites. If you are not a fan of our blog yet (with the hope you will become one :D), let me tell you that the SEO visibility and the "dropped"/"improved" keywords charts are among the most important graphs we use in our in-depth investigations of a site's situation. Here are some of our case studies where you can see the exact applicability of these charts and their importance in accurate penalty identification.


2. Analyze Competitors and Compare Yourself


Competition is always a good thing. It forces us to do our best. However, knowing your competition is even better. Understanding your competitors’ previous decisions regarding their strategies sheds some light on what gave them an advantage or what backfired.


SEO Visibility Competitor Comparison


You know exactly where you stand in your niche, as you now have the possibility to make direct comparisons of SEO visibility with your competitors. Moreover, analyzing the evolution of the most important keywords for you and your competitor will let you discover which keywords drive more visitors and which ones pull the traffic down.

Being able to make a side-by-side comparison, and knowing the exact figures for SEO visibility and keyword ranking fluctuation in your niche, keeps you one step ahead of the competition.


3. Watch for Future Problems


One of the ways to guarantee being a top player in your niche is to stay constantly up to date with how your site and your competitors' sites are doing from an SEO point of view.

Thanks to our fully customizable dashboard, you now have the possibility to set up an SEO visibility widget and monitor it closely. Having the visibility widget on your dashboard keeps you permanently connected to your site's situation, helping you avoid future problems or quickly solve existing ones.


Ongoing SEO Visibility Monitoring from Searchmetrics


You will have an idea of your SEO visibility at just a glance, saving you lots of time and resources. Furthermore, you can make business decisions based on fresh information, as the SEO visibility and keyword ranking graphs are updated weekly. This way, you can spot whenever an issue occurs and react more quickly. If you experience a massive drop in rankings or unnatural link activity on your site, just by taking a peek at the dashboard you will be able to react accordingly. Remember that the dashboard is sharable, meaning that your team or your clients can consult the SEO visibility, with your permission, anytime they want.


You can see how being connected to the SEO visibility graph helped us catch the exact moments when some important sites were penalized and understand what exactly happened to those businesses.


4. Pitch New Clients Based on the Historic Performances


Win New Clients by Spotting Search Engine Penalties


Our marketing communications world is changing at a rapid pace, and pitching new clients can be quite a challenge sometimes. Yet having a strong competitive advantage over other agencies can be a real breakthrough. Not only can you present a prospective client with the opportunities he has in the market, or dissect his competitors' SEO strategies, you now have instant access to his historical performance. You can show the prospect whether he is on an uptrend or a downtrend and for how long, easily explain his evolution by comparison with his competitors' and, last but not least, spot a past or present penalty. All of this is top-quality info, and you might be the only one who can provide this sort of substance.


Adding the SEO Visibility to Your SEO Dashboard


Having the SEO Visibility widget within your reach brings a lot of benefits, as I've mentioned before. So knowing how to set up the widget on the dashboard is important in order to take advantage of all those benefits.


Add New SEO Visibility Widget


First of all, on the dashboard, press the "new widget" button and choose "SEO visibility" as the widget type. Once you've done that, choose the website you want the widget to report on and the country you want to see the visibility for. Then you are set to go, and can monitor the SEO visibility at just a glance.


Several Other Improvements Worth Mentioning


Along with the updates I've just presented above, we have also made some other not-so-big but important improvements to the tool. As I mentioned at the beginning of the article, cognitiveSEO's team is a very dynamic one, always aiming for perfection and happy customers.

Therefore, let me list some of the improvements we've made lately:


1. Page Customization


Customize Your Toolset cognitiveSEO


You can now choose to see what is most important to you in terms of charts and graphs. You can visualize all the charts of the analysis, or easily choose to see only the ones that matter most to you and hide the others.


2. PDF Customization


Create Customized PDFs Export cognitiveSEO


Your reports will now look exactly the way you want them to, containing only the information you want. You can download fully customized reports and present your co-workers or your clients with the whole analysis, or with only the charts and graphs that you choose.


3. Better Crawling Algorithms


To make the process more efficient, and in order not to block the sites being crawled, we integrated smarter crawling technology for some of the sites that presented issues with our previous crawler.


4. Several Improvements Added and Bugs Corrected


Improving cognitiveSEO's overall functionality is one of our main concerns. Like any other tool, bugs and other issues may appear along the way. We have fixed the reported bugs, optimized the dashboard so it now loads faster, and made slight adjustments where needed.


Conclusion


We hope you will be as excited as we are about these great new features and that you will take full benefit from them. We strongly believe they will impact your business positively and bring added value to your work. The SEO visibility and keyword rank fluctuation charts make cognitiveSEO an even smarter tool, providing you with a 360-degree view of the digital market context. cognitiveSEO is not just an SEO tool but a real help for every business: a complex digital marketing platform that now offers more benefits for the same price.



The Ultimate Guide To Google Webmaster Tools

WARNING: If you are not monitoring Google Webmaster Tools, you could be damaging your business!


Do you know how essential it is to have Google Webmaster Tools installed?


When Google crawls through your site, it produces a report and gives it to you for free.  This report is available through Google Webmaster Tools and you can find out what Google likes and doesn’t like about your website.


Wouldn’t you like to know this information?


In this guide we go step-by-step through all you need to know about Google Webmaster Tools.


Why should you install Google Webmaster Tools?


Here are five good reasons to install Google Webmaster Tools:


1.  It shows you which keywords are driving traffic to your website.  Imagine if you saw that you were getting 1,000 visitors from a particular set of keywords and you had no blog post based on that set of keywords.  In this case, you could write a blog post specifically on those keywords and get a ton more traffic.


2. Imagine if Google took a look at your site and found that there were lots of dodgy links coming in to your site, wouldn’t you like to know that Google has identified a problem?


3.  When Google is indexing your site, it checks out your meta title (a one-line description of your page) and your meta description (a more detailed description).  If Google finds that you have duplicates then it’s harder to index the content correctly.  Wouldn’t you like to know if this is the case?


4.  Every site should have a sitemap, which describes all the content on your site and tells Google how often to index it.  If Google has problems with this sitemap, wouldn’t you like to know about it?


5.  Is your content strategy working?  If you are writing great content, the amount of your content that appears in search results should be going up all the time.  Wouldn’t you like to know this?


How do you install Google Webmaster Tools?


You will need a Google account, which you probably have already. Once you have an account, go to the Google Webmaster Tools set-up page and click 'add site'.


Add your site to Google Webmaster Tools


You will need to verify that you are the owner of the site.  Google gives you a few options to prove that you own your website:


a) Wherever you have registered your domain, you can add some information to the DNS configuration that shows Google that you own the domain.


b) Upload an HTML file.


c) Link it to your Google Analytics account.


d) Add some information (a verification meta tag) to your home page – see the sketch after this list.


e) Link it to your Google Tag Manager account.
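For option d), the "information" is a single verification meta tag that Google generates for you, pasted into the head section of your home page (the content token below is a placeholder for the unique one Google gives you):

  <meta name="google-site-verification" content="your-unique-token-here" />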


Once your site is ready, you are up and running.


So, let’s step through all the sections of Google Webmaster Tools.


Site Dashboard


The site dashboard gives you an initial overview of the health of your website.  This is a summary and, when we go through the menu items, you can see that more details are provided for each section.


It contains three sections:


Current status


This is the first thing you'll see. You immediately want to see three green ticks:


DNS – When Google looks up your site, it needs to connect with a DNS (domain name server), which tells Google where to go to find your website.  If you see any errors on the DNS, that may be because Google has a problem accessing your DNS server.  It could be that it can’t connect with it at all or that there is a delay.  Either way, if you don’t see a green light, then you need to contact your hosting provider.


Server connectivity – If Google gets past your DNS but can’t then connect with your server, that’s another issue.  You don’t want your site to be down and you don’t want a super-slow website.  Get on to your hosting company, upgrade your hosting or move to another hosting provider if this happens regularly.


Robots.txt fetch – Normally, every site has a robots.txt file.  This gives Google some instructions on what to index and not index on your site.  It should also have information for Google about your sitemap (see later explanation).  It’s not essential that you have one but it is advisable.


For example,  here are two typical lines in a robots.txt:


  • Disallow: /wp-includes/ – This means don’t index anything in this directory

  • Sitemap: http://www.razorsocial.com/post-sitemap.xml – Here is the sitemap, i.e. you’re telling Google where it is.

So, it’s important that Google can access it correctly.  There could be areas of your website you don’t want indexing at all, and there could be one or more sitemaps you want to tell Google about.
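Putting those pieces together, a minimal robots.txt might look like this (the first line simply says the rules apply to all crawlers):

  User-agent: *
  Disallow: /wp-includes/
  Sitemap: http://www.razorsocial.com/post-sitemap.xml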


Dashboard crawl errors: a snapshot of problems


Underneath this, you will see any errors that Google has found.  There are generally some errors displayed here and some are not of concern, but you do need to investigate them.  Here’s an explanation of each section:


  • Server error – If Google can’t access any URLs, you’ll see an error here.

  • Not found – This means that your server returned a 404 error, which means the page is not there any more.

  • Soft 404 – Normally, a 404 error is returned when a page is not found. A 'soft 404' is when the server instead returns a '200' (success) code – for example, sending the person browsing to the home page instead of an error page. Google does not like this, so it's not recommended.

  • Not followed – This is where Google can access your page but can’t read your content. This is an issue for your developer to resolve.

  • Other – If Google finds other issues that don’t fit into the categories above, they will list them here.

‘Not found’ errors


This is where you will generally have some issues listed, but a lot of the ‘not found’ errors could be valid.  Here are some things to consider:


  • Deleted pages – You may have created a page and then deleted it when you were finished with it.  If Google hasn’t updated the index yet, it will still attempt to access that page and return an error.  This should correct itself.

  • Pages that shouldn’t be indexed – Maybe there’s a page that Google has indexed when it shouldn’t have.  You need to check your sitemap to see if it’s in there, and you may need to update your robots.txt to make sure Google doesn’t have access to the directory where the page is.

  • The page is gone – Maybe it’s a genuine problem where the page has been removed accidentally, in which case it’s time to retrieve the file from your backups.

  • Linking to something invalid – Google finds a page and then follows the links on that page. If your links are broken, then that's an issue. One thing you should definitely do on a regular basis is check and fix any broken links. The best tool for doing this is Broken Link Check, which will tell you which links are broken. If people click on links that go nowhere, that creates a poor user experience, so it's good to do this check on a regular basis.

Note: 404s don’t affect your site’s rankings but it’s still good to clean them up.


You can also go through each broken link, one by one, in Google Webmaster Tools.  You click on the broken link, select the ‘Linked from’ section, and this will show you the page that links to the broken link.


Find out which page links to the broken link


Site Messages


This is where Google gives you information about any problems it finds on your site.  Ideally you don’t want to see anything in this section.  But you could see messages related to:


a) Your site has been hacked


If Google thinks your site was hacked, you’ll see a warning.  The first thing you need to do is talk to your development team and/or hosting team to do an analysis and resolve this as soon as possible. Google is not going to send you a lot of traffic (if any) if it thinks your site has been hacked.


b) Unnatural links pointing to your site


It’s bad if you link out to dodgy sites, but it’s also bad if you are getting links back to your site that look unnatural from Google’s point of view.  Unnatural links could be a high volume of links from sites that Google has identified as spammy.


It’s important to be notified of these messages.  Select the ‘preferences’ option at the top right of your screen and make sure notifications are switched on.


You really want to get an email if there are issues!


Under the 'type' section, you can specify whether you want to be notified of all messages, or just the top issues.


Search Appearance


The default screen displayed is the dashboard screen that we explained earlier on in this document.  This shows three sections – crawl errors, search queries and sitemaps.


There are also some additional menu options:


Structured Data


Structured data is additional information added to your web page to describe what is on it. You use a specific vocabulary, available on Schema.org, and mark it up using Microdata, RDFa or Microformats. They all do pretty much the same thing.


Why would you use structured data?  


Here’s an example:


I searched for ladies’ shoes on Google.  I don’t normally search for ladies’ shoes, but I thought it would be a good example!


You can see that there are star ratings added to this listing.  This makes the listing stand out on Google and, with a high rating, the click-through rate will be higher.  This is microdata.


An example of additional information (star ratings) provided in search results


This section tells you if you have any errors with your microdata that need correcting.
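To give you an idea of what the markup looks like, here is a sketch of a product with review stars using the Schema.org vocabulary in Microdata format (the product name and numbers are invented for illustration):

  <div itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Classic Ladies' Leather Shoes</span>
    <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
      Rated <span itemprop="ratingValue">4.5</span> out of 5,
      based on <span itemprop="reviewCount">89</span> reviews
    </div>
  </div>

Markup along these lines is what allows Google to show the star ratings in a listing like the one above.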


Data Highlighter


After reading the previous section, you may be thinking that structured data is a bit complicated. It is, but not for a developer. However, the Data Highlighter is functionality provided by Google to make it easier to add structured data to your content.


This is how you do it:


Enter the web address of your site and specify what type of page it is.  For example, it is a page all about book reviews.


In our example, we picked articles. When you use the highlighter, you can decide if you want Google to tag just this page with the additional information, or if you want Google to tag all similar pages. This saves you performing the same process for hundreds or thousands of similar pages!


Data highlighter configuration


In the following example, I have highlighted the title of the post and then selected ‘Title’.


Highlight relevant sections and tag them


As you are highlighting sections you’ll see that, over on the right-hand side, Google displays what you are tagging.  In this example, I highlighted the blog title and an image within the blog post.


As you highlight and tag items, Google will display them to the right


Your best option for adding this structured data to your content is to code it within your pages (e.g. using the Schema.org vocabulary). Using the highlighter is not as good, because it is specific to Google only. If you don't add the code, then other search engines or products outside of Google will not see this structured data.


However, Google is probably your number one target and, if you can’t afford to get a developer to do it and you want it done quickly, this is a good way of doing it.  For example, imagine you had a website with 1,000 pages and each page covered a product and had reviews of it.  You might want to use the highlighter as a quick way of telling Google about the reviews.


HTML Improvements


This is a useful area because it displays any issues with your content that would affect your ranking, so resolving all issues is worth your while.  Here are the sections it covers:


  • Meta Descriptions – When you search Google, the first line you see in the search results is the 'meta title', and the next couple of lines are the 'meta description'. It's best to keep this content between 150 and 160 characters. If your descriptions are too long, they won't be displayed in full by Google and, if they are too short, there may not be enough information for the searcher to decide whether the article is relevant to them or not. It's also not good if the meta descriptions are exactly the same for two different articles. So, look at the errors provided by Google and resolve them!

  • Meta titles – When Google is indexing your content, it looks at a tag called the title tag. Google will normally display 50 to 60 characters of a title tag; it's not an exact number of letters because, if you have lots of narrow letters in your title, more characters will fit in the same space! As title tags are used as part of Google's indexing algorithm, you don't want to mess with them. Make sure there are no duplicates, and that they are the right length rather than too long (see the sketch after this list).

  • Non-indexable content – If Google finds content that it can’t index (and you didn’t specifically ask Google to not index it), you will see a list in this section.
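To make this concrete, here is a sketch of a well-formed title tag and meta description in a page's head section (the wording is a made-up example; aim for roughly 50-60 characters in the title and 150-160 in the description):

  <head>
    <title>Google Webmaster Tools: A Step-by-Step Guide</title>
    <meta name="description" content="Learn how to install Google Webmaster Tools, fix crawl errors, clean up your meta tags and find the keywords that drive traffic to your site.">
  </head>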

Identify and resolve any issues displayed


Sitelinks


If Google views your site as an authoritative site then, instead of just showing a one-line listing for your website, it will show some links to high-authority pages on your site.  This is great, because your listing will appear a lot bigger so it will look more important and people will be able to click directly onto an article of interest.


These links are automatically retrieved and displayed by Google


You cannot ask Google to create sitelinks for you if they are not already displayed; Google makes that decision.  You also cannot ask Google to add on a sitelink to a page you think is important.  The only thing you can do is demote a link and remove it from the listing.


 


[Image: Demote a sitelink that you do not want]



Search Traffic


This provides a wealth of information related to search on your site.


Search queries


You can view this in ‘top queries’ or ‘top pages’.


Top queries


This shows you a list of the keywords that are driving you the most traffic.  Here’s how it looks:


 


[Image: View the top queries for your site]


Here is an explanation of each of the columns:


  • Query – The keywords used to find your site.

  • Impressions – This is the number of times a web page on your site was displayed in the results for a keyword search.  If someone searches and you are on page 2 for that search, it is only counted as an impression when the searcher goes to page 2.

  • Clicks – This is the number of times someone clicks on your link in the search results.

  • CTR – This is the click-through rate, which is the number of clicks divided by the number of impressions (see the worked example after this list).

  • Avg. Position – The average position at which your pages appeared in the search results for those impressions.
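
As a quick worked example: if your listing was displayed 1,000 times (1,000 impressions) and clicked 50 times, your CTR is 50 / 1,000 = 5%.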

What to do with this information


Here are some things to consider:


1. Are there any search queries that are not relevant to your site?  Sometimes, you end up ranking for content that is not relevant at all to your business.  Imagine if you had a website about gardening and you wrote a post about ‘Rihanna’s habits for gardening’.  If you are ranking for ‘Rihanna’s habits’, that is not going to be relevant.  It’s ok if there are a couple of keywords like this that don’t make sense, but you certainly don’t want a load of them.


You want Google to understand what your site is about.  If you are ranking for irrelevant keywords, find the offending page and start changing the content so you are more likely to rank for relevant keywords.


2. Is your click-through rate low for relevant content? Imagine if you had really good content that was found regularly but the click-through rate was low.  To improve the click-through rate, you may update the title and description so they are more relevant and also more enticing.


3. Are you ranking for keywords without any focussed articles for those keywords?  You might find that you are ranking for keywords you never tried to rank for, and maybe your average position is on page 2 or further down.  If you’re already getting traffic for keywords on page 2 and you haven’t targeted these keywords, it’s highly likely that you will get more traffic if you write an article really focussed on those words or phrases.


4. Are you getting clicks but your article is not appearing on page 1?  If you have an article focussed on particular keyword terms and you’re getting traffic, but your average position is on page 2 or lower, then you may want to optimize this post further.  This means reviewing and updating the content, building links from other pages on your site or building links from external authority sites.


Tip: If I had an article on ‘Facebook Contests’ and I wanted to find other articles on my site I could link to, I’d go to Google and type ‘Facebook contests site:www.razorsocial.com’.


As part of the search queries report, you can also view a graph of queries, impressions and clicks based on those impressions over a specific time period.  Of course, ideally you want to see this going up all the time.


But the most important figure is clicks.  Impressions are pointless unless people are clicking on your content.


 


[Image: Query graph showing queries over a period]


Top Pages


Instead of viewing by top keywords, you can also view by top pages.  There is an option here to compare the impressions, clicks etc. for each page against the previous period.  It is very useful to look at this.


In the example below, we have highlighted where we update this setting.


 


[Image: View a comparison over the previous period]


Over on the left-hand side of the image above, you’ll see a triangle symbol.  Click on this and Google will show you all the search queries used to send traffic to that blog post.  This is really interesting information!


 


Links to your site


This shows you the links from external sites to your site.  You can drill down and pick pages on your website to view all the links from those pages.  Here’s an example:


 


[Image: View a list of links by page]


It’s useful, every so often, to do some link analysis.  You can export all the links and view them in an Excel spreadsheet where you can do further analysis.  One area to focus on is identifying who is linking to which blog posts, and which posts get the most links.  Why waste time on content that people aren’t linking to?  If people are not linking to your content, it’s unlikely you will rank for it.


You’ll probably find that the articles with the highest number of links are the long, detailed articles, infographics and group posts!


Although you can do some analysis here, my preferred tool is Ahrefs, which gives me more details related to the links e.g. the value of each link.


Internal Links


Internal links are links from one page on your site to other content on the same site.  I don’t find this section particularly useful because menus appear on every page, so it looks like your posts are linking to every menu item on every page.  This means you are shown tons of links, which makes it harder to find the ones that are relevant.


What is important is that you have an internal linking strategy to link relevant posts together.


As mentioned earlier, a good tip for finding posts worth linking from is to go to Google, type the keywords of the post you want to link to, and then add your domain name using the ‘site’ command, as follows:


“Facebook applications” site:www.razorsocial.com


You’re looking for instances of the term ‘Facebook applications’ being mentioned in posts within razorsocial.com; these posts are relevant, so they are good places from which to link to your post.


Remember, you can link old posts to new posts and new posts to old ones!


Manual Actions


When you arrive on this page, you want to see a nice message saying ‘no manual actions found’.


Manual actions are penalties imposed by Google on your site because they have found something on your site they don’t like.  Manual actions can be site-wide matches that affect your whole site, or partial matches that affect parts of your site.



When you see that an action has been applied, you’ll see the reason for the action and how much of the site has been affected.


The types of manual actions could be:


  • Unnatural links to and from your site – Make sure you remove these links. If external sites that Google doesn’t like are linking to you, reach out and ask them to remove the link.

  • Hacked site – Get on to your hosting provider and/or development team to resolve this ASAP.

  • User-generated spam – Spam content on your site.  Get your development team to do an analysis and remove it.

  • Content with little or no added value – Remove this content and start focussing on higher quality content.  You might have thousands of pages with very little content on each page and it looks like you just created the pages to rank for the content.

  • Pure spam – Get rid of any pure spam content.

  • Spammy freehosts – A significant percentage of the sites on your free hosting service are spammy.  Even if your own pages are clean, Google may take action on the whole hosting service, so consider moving to a reputable host.

  • Spammy structured markup – If you’ve added data to describe your content but Google thinks it looks spammy, you will either have to fix it or remove it.

  • Hidden text or keyword stuffing – This is an old bad practice where extra, hidden content was added to a page to rank for keywords.  Remove it ASAP!

  • Cloaking or sneaky redirects – You are showing Google different pages from the ones you show your website visitors.  Change this ASAP!

When a manual action has been taken on your site and you have resolved the issue, you can submit a reconsideration request to Google to get the penalty lifted.  For example, if your site was hacked, once you resolve this Google will restore your previous listings.



International Targeting


There are two options here:


a) Language targeting


If you are targeting people in different countries, you will need to point people to a page with the relevant language.  In this section, any language targeting will be displayed.  Google will look at your website pages and try to find a line like this:


<link rel="alternate" hreflang="x" href="alternateURL">


This will tell Google if you are doing language targeting.
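

As a concrete sketch, a page that exists in English and German might carry these two lines (example.com and the paths are placeholders):


<link rel="alternate" hreflang="en" href="http://www.example.com/en/page.html">
<link rel="alternate" hreflang="de" href="http://www.example.com/de/page.html">


Both language versions of the page include both lines, so Google knows the two URLs are alternates of each other.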


b) Country targeting


In Ireland, the domain extension is .ie, so if Razorsocial was focussed only on Ireland, we would be razorsocial.ie.  That is a country-specific extension and Google automatically assumes you are targeting Irish visitors.


But, if you have a non-country-specific domain like .com or .org, you should tell Google which country is the most important one for your rankings.


 


[Image: Specify the country you want to target]



Google Index


This gives you information related to the indexing of content by Google.


Index Status


This will show you the number of pages on your website that are indexed by Google, as well as the ones that are blocked from being indexed.


Ideally, you want to see the graph going up as you write more content.  Google never indexes 100% of your content but you’re looking for a number that is very close to the total number of pages on your site.


If you go to the advanced section, you can also see the total number of pages that have been blocked.
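

Pages are typically blocked either by the robots.txt file (explained below) or by a robots meta tag in the page itself.  As a minimal sketch, a page can opt out of Google’s index with one line in its <head> section:


<meta name="robots" content="noindex">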


 


[Image: View a summary of the index status]


Robots.txt is a text file that is normally found at the root of your website (e.g. www.razorsocial.com/robots.txt).  In this file, you give Google’s bots (the software that comes to your website to index it) specific restrictions and information about what to index and what not to index.  There may be a membership site you don’t want indexed, specific parts of your WordPress install, private web pages etc.
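

For example, a very small robots.txt could look like this – the directory names are placeholders for areas you don’t want indexed:


User-agent: *
Disallow: /members/
Disallow: /wp-admin/


The ‘User-agent: *’ line applies the rules to all bots, and each ‘Disallow’ line stops bots from crawling one directory.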


If you think there are too many pages being blocked, you need to investigate your robots.txt to make sure it’s blocking the right content.


If you feel that not enough pages are being indexed, you need to start looking at your sitemap file to make sure you are telling Google to index all the right content.


Content Keywords


This is where Google does an analysis of your website to see which words are appearing most frequently.  You want to make sure that these keywords are relevant to your site.  If you find that spammy keywords are appearing here, you can click on the keyword to find out which articles are using these spammy keywords and then you can remove them.


 


[Image: See what Google thinks your site is about]


Remove URLs


If there is a URL that Google is indexing, you can submit a request for Google to remove that URL.


But be careful.  Google specifically says that this should only be used in emergency situations, such as when confidential information has been exposed.  It clearly states that using it for other purposes may affect your site.


So, don’t use it if it’s a page that is giving a 404 error (page not found), or just any page that shouldn’t be indexed but is not doing any harm.


Really, you should only use this in an absolute emergency; Google’s guidelines for URL removal spell out when it’s appropriate.


However, before you resort to this, you should try to get rid of the page yourself.  You can rebuild your sitemap so that the URL is no longer in it.  If it’s a directory with only a few pages, you could block Google’s bot from indexing the content by adding a relevant line to the robots.txt file.


 


Crawl


Does it feel creepy that Google crawls through your website?  It is creepy, but I welcome it!


Google sends out its bot to look for new content, but it’s friendly enough to tell you if it finds any problems, which it breaks down by desktop, smartphone and feature phone.


In the list below, there are a few errors worth investigating and resolving.


 


[Image: Find and resolve any crawl errors]


For example, the first error is a server error relating to a page that could not be found.  I clicked on the link and, on the page below, I could use ‘Fetch as Google’.  The page was returned fine, so I just marked the issue as fixed.


 


 


[Image: Server error details – mark the issue as fixed if it’s no longer an issue]


Crawl Stats


The crawl stats show you the number of pages that are crawled per day, the total content downloaded each day (to analyze for indexing) and the time spent crawling the pages.


Ideally, you want the number of pages crawled to be going up as you add more content.  If it isn’t, then you need to start investigating this (clue: start with the sitemap).


If the time it takes to crawl your site is going up a lot, that’s a cause for concern.  Google wants a fast website, so you need to make sure it’s not taking too much time for Google to download content.


For example, your server could be slow at certain times of day and this could be when Google is crawling it.  This may mean you need to change servers.


Fetch as Google


This allows you to ‘fetch’ your content and see it as Google sees it.  If Google has problems accessing your content then you will need to resolve this.


You can fetch it as code, or fetch it fully rendered (i.e. displayed as a visitor would see it, not as code).


Robots.txt Tester


If you want to check whether your robots.txt file is blocking a page that it shouldn’t, you can enter that page’s web address here to test it out.


Sitemaps


The sitemap is where you tell Google about all the content you have on your website and how frequently you want it to be indexed.  There’s no point in getting Google to index content every day if it doesn’t change that often.
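

A sitemap is just an XML file.  Here’s a minimal sketch of one with a single entry – the URL and date are placeholders:


<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/my-blog-post/</loc>
    <lastmod>2014-10-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>


The optional ‘changefreq’ value is where you hint at how often the content changes, so Google doesn’t waste time re-indexing pages that rarely change.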


You’ll notice that you’ll never get everything you submit indexed but, if the number of submitted pages and the number of indexed pages are very close, you don’t have a lot to worry about.


 


[Image: Post sitemap – very close to 100% indexing]


If you don’t have a sitemap, there are various tools that will help you create one.  For example, WordPress SEO by Yoast creates and updates the sitemaps required (and you can have multiple sitemaps for different areas of your site).


URL Parameters


Sometimes, the same content on your website can be reached at web addresses that differ only in the parameters tagged on at the end of the page name (for example, a product page reachable as both /shoes and /shoes?sort=price).  Google will normally figure out whether it’s the same content or not but, if it can’t work it out, you may need to provide it with some help.


Be very careful with this section – you’re probably better off avoiding it because it’s super techie!


Security Issues


If there are any security-related issues found by Google, they will be listed here.  Obviously, you will need to resolve any security issues as soon as possible.


Summary


Google Webmaster Tools is an extremely important item in your toolbox.  Google tells you about any issues it finds when it crawls your website and, because Google probably sends you the most traffic, you need to pay attention.


Have you installed Google Webmaster Tools yet?


Do you look at it regularly?


Will you install it after reading this post?


Thanks for reading!


Ian


 


Map image by Shutterstock


The post The Ultimate Guide To Google Webmaster Tools appeared first on RazorSocial and was written by Ian Cleary


