Friday, January 26, 2018

What's the difference between organic and paid results?


Organic results are the listings that appear in search engines for free, ranked by an algorithm. Paid -- or inorganic -- search results appear at the top or side of a page; these are the links that advertisers pay to have appear on the various search engines.


Thursday, December 14, 2017

Keyword Analysis: How to Analyze Your Keywords Effectively

What Is Keyword Analysis?

Keyword analysis is the process of analyzing the keywords or search phrases that bring visitors to your website through organic and paid search. As such, keyword analysis is the starting point and cornerstone of search marketing campaigns.

By understanding what queries qualified visitors to your website type into search engines, search marketers can better customize their content and landing pages to drive more traffic and increase conversion rates. For this reason, keyword analysis is an important skill for both SEO and PPC experts.

Keyword analysis helps to increase conversions, find new markets, and optimize spend, but it requires time-consuming examination and decision making to beat your keyword competition. The WordStream Keyword Analysis Tool takes the analysis of your website keywords a step further by not only analyzing your keywords but also suggesting actions and automating your activity for the best efficiency and results.

Do you really want to look at spreadsheets and graphs for hours a day every day? And after that, what comes next? WordStream eliminates this time waste, streamlining the process of analyzing keywords, highlighting the vital marketing performance metrics, and prioritizing actions to greatly improve your efficiency while simultaneously improving your PPC performance.


The Importance of Keyword Analysis

Marketing is inherently analytic. Field-testing marketing outreach and marketing performance is key to optimizing the budget allocation and market reach. Search marketing is no different, and since keywords dictate your entire search campaign, keyword analysis should be your primary focus. Analyzing keywords allows you to:

Optimize Spend: Distribute more budget to successful keywords and eliminate wasteful spending on those that aren't producing results

Increase Conversions: Identifying and focusing on well-converting keywords is good for conversion rate optimization and return on investment (ROI)

Eye Trends: Knowledge of keyword search frequency provides insight into market behavior which you can apply to multiple aspects of your business

Prioritize Your Time: Keyword performance guides campaign importance--spend your time optimizing areas that have the biggest impact on your bottom line

Find New Markets: Use keyword analysis to expand your long tail efforts and discover more specific keyword queries and corresponding warm leads

Despite all the benefits, most search marketers don't spend nearly enough time on keyword analysis because it's time-consuming and repetitive. With keyword software from WordStream, analyzing keywords and capitalizing on data is automated and simplified, making you more productive.


Keyword Analysis With WordStream

Google AdWords includes a keyword analyzer and reporting section that details data like keyword clicks and campaign success. However, it's difficult to examine keywords and how they relate to each other from inside these tools.

Though Google's tools are helpful in obtaining an overview of your account, what you do as a result of that data and how you learn from it to improve your performance is totally up to you. In addition to the advanced keyword search capability, WordStream evaluates keyword popularity and visit totals to segment keyword groups for you. Highly relevant keyword groups are a crucial step towards improving your Quality Score, and WordStream's learning software ensures your keywords are segmented as relevantly as possible. The software determines the best segmentation based on its own keywords analysis as well as any additional filters and rules that you define.
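As an illustration of the general idea only (WordStream's actual segmentation algorithm is proprietary, and this is not it), here is a toy Python sketch that groups keywords by a shared head term; the keyword list and the "last two words" heuristic are assumptions for demonstration:

```python
from collections import defaultdict

keywords = ["red tennis shoes", "blue tennis shoes",
            "digital camera reviews", "cheap digital camera"]

groups = defaultdict(list)
for kw in keywords:
    words = kw.split()
    # use the last two words as a crude "head term" for grouping
    groups[" ".join(words[-2:])].append(kw)

for head, members in groups.items():
    print(head, "->", members)
# tennis shoes -> ['red tennis shoes', 'blue tennis shoes'], etc.
```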


The keyword source analysis feature allows you to analyze search engine keywords in aggregate or drill down to focus on just paid keywords, just organic keywords, paid and organic keywords together, or just keyword suggestions and associated traffic estimates. This feature enables you to perform SEO-driven and PPC-driven keyword analysis with the same tool.

WordStream software is fully dynamic, so keywords people enter to find your specific website are added to your account daily: your own private, automated keyword discovery tool! Existing keywords will be updated with the most recent visit data, and new keywords will flow through to groups based on the rules you set while segmenting keywords. In other words, your keyword segmentation suggestions, long tail metrics, and visit tallies are always automatically based on current data.


Keyword Analysis Steps

What's next, you ask? Follow the guidelines below to fully analyze and profit from your keyword data:

  1. Sign into your WordStream account. If you don't yet have an account, register for a Free Trial here.
  2. If your account is new, your first step is to use the keyword discovery tool to parse your web files for your complete keyword history to populate your WordStream account. Alternatively, import keywords using WordStream's keyword suggestion tool.
  3. Segment keywords by relevancy with WordStream's automated grouping tools.
  4. Use WordStream's negative keyword tool to create negative keyword suggestions and eliminate irrelevant keywords.
  5. Keep an eye on your long-tail keywords as they flow in for landing page and SEO ideas.
And that's it! Perhaps your next step is to determine how you want to spend all your new-found free time!


Get a Free PPC Keyword Analysis with the AdWords Performance Grader

WordStream’s AdWords Performance Grader is a comprehensive free tool that helps you evaluate how your AdWords campaigns are performing on several key criteria, such as:

  • Long-tail keyword optimization
  • Effective use of negative keywords
  • Quality Score

The Performance Grader allows you to compare your score with competitors in similar spend brackets, giving you a sense of where you fit into the competitive landscape.

The AdWords Performance Grader shows you where your campaigns are struggling and how to make adjustments that will improve your performance and increase ROI. It’s an expert analysis, and it’s absolutely free!



Tuesday, December 12, 2017

Keyword Research Methodology in SEO

Any good SEO company or consultant is going to tell you that keyword research may be one of the most important things you do for your SEO. The keywords you target will directly affect what searches you rank for and what kind of traffic will be directed to your site. Failing to include the right keywords means missing out on potential traffic.

Step One: Read Your Content

Keyword research is done on a page-by-page basis. Search engines rank individual pages, not sites as a whole. Each page will have its own set of keywords that directly relate to the content of that page. Before you start your keyword research, read each page carefully and take notes about what that content is about.

Step Two: Check Your Analytics

Before you start changing keywords, you need some idea of which ones currently work and which don’t for each page. You should have signed up for Google Webmaster Tools (if you haven’t, you really should), which can help you determine which keywords are leading people to your site and which pages they are coming in through. This is important because you don’t want to accidentally remove keywords that were helping drive traffic to your site.


Step Three: Use a Keyword Research Tool to Create a Keyword List

Once you’ve determined what your content is about and what keywords are currently working for that page, you need to come up with several related variations of that keyword. Different people will search with different terms, and you want to make sure your site shows up for those searches as well.

3A: I’m a fan of the Google Keyword Research Tool. This free tool uses Google data to determine related searches. It also shows you the search volume for those variations. Keywords with a high search volume are more competitive.

3B: For this example, plug “wholesale restaurant supply store” into the Keyword Research tool. If you aren’t logged in to an AdWords account, Google will only display 100 variations. (If you are logged in, you’ll see that there are 800 variations!). Make sure you’ve selected the United States and English as your parameters.

3C: Above the keywords you have the option to download the list. You want to export them to Excel, so choose the CSV for Excel option.

3D: The list will have 4 columns (keyword, competition, global monthly searches and local monthly searches). You only need the keyword column and local (meaning United States) monthly searches.
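If you’d rather not sort this by hand in Excel, here is a minimal Python sketch using only the standard library. The file name and column headers are assumptions based on the export described above; adjust them to match your actual file.

```python
import csv

rows = []
with open("keywords.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # volumes may be formatted with thousands separators, e.g. "1,300"
        volume = row["Local Monthly Searches"].replace(",", "")
        rows.append((row["Keyword"], int(volume or 0)))

# Sort by local (United States) monthly searches, highest first
rows.sort(key=lambda r: r[1], reverse=True)
for keyword, searches in rows[:10]:
    print(keyword, searches)
```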

Step Four: Scrub Your List

Scrubbing is the most important step of keyword research. Obviously there is no way you can target 800 different keywords, so you need to trim your list way down. Going keyword by keyword, you need to determine if that keyword is relevant to the content on the page! Here is a sample of a scrubbed and non-scrubbed list for “wholesale restaurant supply store.”

Non-scrubbed
wholesale restaurant supply store
restaurant supply
restaurant wholesale supply
wholesale restaurant equipment
restaurant supply store
restaurants supplies
wholesale restaurant supplies
restaurant supply stores
restaurant supply los angeles
restaurant supplies
restaurant supply dallas
wholesale restaurant supply
restaurant supply san diego
used restaurant supplies
restaurant supplies and equipment
restaurant supply portland oregon
restaurant equipment
used restaurant equipment
restaurant equipment supply
used restaurant equipment for sale
restaurant supplies store
restaurant supplies dishes
japanese restaurant supply
restaurant equipment leasing
restaurant equipment supplies
hotel & restaurant supply
restaurant supplies wholesale
restaurant food supply
restaurant supply company

Scrubbed
wholesale restaurant supply store
restaurant supply
restaurant wholesale supply
wholesale restaurant equipment
restaurant supply store
wholesale restaurant supplies
restaurant supply stores
wholesale restaurant supply
restaurant supplies and equipment
restaurant equipment supply
restaurant supplies store
restaurant supplies wholesale

Now the scrubbed list is still a little too long, so you need to trim it down even more. A good rule of thumb is to go after 2-5 keywords per page.

Do this process two or three times per page, changing up the keyword each time. For instance, the second list I create would use “restaurant equipment supply store” and the third list would use “restaurant supplies distributor” as the starting keyword. After scrubbing those lists down, you can create a final master list that has the best and most relevant keywords from all three variations. Now you need to scrub that master list down to your best 2-5 keywords.
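If the manual scrub gets tedious, a rough first pass can be automated. This is a toy Python sketch, not a replacement for human judgment; the must-have and exclusion term lists are illustrative assumptions, and you still need to eyeball the result by hand.

```python
MUST_HAVE = {"supply", "supplies", "equipment"}
EXCLUDE = {"used", "leasing", "sale", "dishes", "hotel", "food",
           "los", "dallas", "san", "portland", "japanese", "company"}

def scrub(keywords):
    kept = []
    for kw in keywords:
        words = set(kw.lower().split())
        # keep only keywords with a relevant term and no irrelevant ones
        if words & MUST_HAVE and not words & EXCLUDE:
            kept.append(kw)
    return kept

print(scrub(["wholesale restaurant supply store",
             "restaurant supply los angeles",
             "used restaurant equipment"]))
# -> ['wholesale restaurant supply store']
```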

Thursday, December 7, 2017

Types Of Keywords in SEO

When trying to create content that’s friendly to both organic search and paid search, it’s important that your digital strategy uses keywords from the three main buckets below. Each type of keyword has its own pros and cons, but wrapping them all together into one solid keyword strategy will yield the strongest results. There are mainly three types of keywords:

  • Generic Keywords
  • Broad Match Keywords
  • Long Tail Keywords

Generic Keywords


Just as the title suggests, these are very generic, unspecific terms that get searched for. Something like “Tennis Shoes” or “Digital Cameras” would be considered a generic term. When developing an organic search strategy, we typically steer away from these terms, as they are highly competitive and not specific enough to the site’s actual content. However, if you are able to rank for a generic keyword, your site should receive a decent amount of traffic from that term. Conversions for that term might be a little low, as users are hitting your site for a very generic, overarching topic and nothing too specific.

When running an AdWords campaign, it’s nice to integrate some of these generic keywords to make sure that every opportunity is covered. Due to the competitive nature of generic keywords, they will cost more per click. If the ad ranks well and receives a good Quality Score from Google, decent traffic might follow. Just as with organic search, once a user gets to the website from the search engine, conversions will most likely be low for this term.

Generic terms are a tough decision to pursue and I tend to avoid them unless I have the right site with the right content and the right promotion budget behind it.

Broad Match Keywords



Broad match terms are the core of SEO. Terms like “Red Tennis Shoes” or “Canon T2I Digital Camera” present a stronger opportunity for engagement than a generic term. Optimizing for broad match terms will provide good traffic with less competition. A broad match searcher has a specific item or piece of content they are searching for, and optimizing for these types of terms will provide an average amount of conversions.

Broad match terms are right in the middle of things and are highly recommended due to their moderate competition/cost and click-through rate. A site that bases the majority of its content around these types of terms should perform pretty well.


Long Tail Keywords


The last of these three types of keywords to consider is the long tail keyword. Think of these as the sentences that get typed into Google. Something like “how do I set the aperture on my Canon T3I digital camera” would be considered a long tail keyword. Long tail keywords might not be the biggest traffic drivers to your site but if you rank for a long tail term you will get traffic due to its specific nature and low competition. From an AdWords standpoint, these terms will be the most affordable but traffic might not be as abundant. However, conversion rates for these terms should be stronger than generic or broad keywords.

The meat of a strong keyword strategy will reside within the broad match keywords, but long tail and generic terms should be integrated from both an SEO and SEM perspective to maintain a balanced approach to your search marketing ecosystem.
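If you want to bucket a keyword list quickly, a crude heuristic is word count. The thresholds below are illustrative assumptions, not an official taxonomy:

```python
def keyword_type(query: str) -> str:
    n = len(query.split())
    if n <= 2:
        return "generic"      # e.g. "tennis shoes"
    if n <= 4:
        return "broad match"  # e.g. "canon t2i digital camera"
    return "long tail"        # e.g. full-sentence questions

for q in ["digital cameras",
          "canon t2i digital camera",
          "how do i set the aperture on my canon t3i digital camera"]:
    print(q, "->", keyword_type(q))
```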



Tuesday, December 5, 2017

SEO Competitive Analysis & Research

A potential client once told me, “I’d like to rank for the word jewelry.”

“OK,” I said, “And what’s your budget?”

His answer was not surprising to me. “Weeelll, I can spend about $500/month on organic SEO.”

SEOs all around the world are used to having similar discussions on an almost daily basis, trying to explain to potential customers the balance between search volume, domain authority, brand authority, and budget.

In this article, we’ll explore how to perform competitive research in order to formulate a clear, realistic, and cohesive strategy.

How do keyword research and competitive analysis go hand in hand? What is the barrier to entry for my industry? What are terms that my company can realistically rank for, and how long will it take? Let’s explore.

Step 1: Keyword Research

The first step is to identify the keywords that your site can REALISTICALLY target. For this article, we’ll use “jewelry” as an example. Unless you have the brand authority or budget to compete with companies such as Tiffany.com and Kay.com, the possibility of any small or medium business ranking for the keyword “jewelry” is practically zero.

Instead, your first goal is to find an achievable niche within this competitive industry. Mindmapping is a great way to organize your keyword concepts and buckets.

There are a myriad of ways to grow your keyword list:


  • Identifying your competitors and typing them into SEMrush to see the list of keywords they’re ranking for
  • Using Spyfu to get a list of keywords they are bidding for
  • Using Keyword Planner to get a list of possible keywords related to a primary term

Step 2: Identify Your Top Competitors

Once you have your keyword list, type those terms into Google and write down the sites that show up in the top 10. Often you’ll see the same sites appearing again and again. You’ll want to identify the sites that rank for many keywords in that niche, and add them to your list of competitors.
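If you record your SERP observations in a structured form, counting the recurring domains is trivial to automate. A minimal Python sketch with made-up sample data (the keywords and domains below are purely illustrative):

```python
from collections import Counter

serps = {  # keyword -> domains you noted in the top 10
    "silver earrings": ["etsy.com", "marysilver.com", "overstock.com"],
    "handmade silver earrings": ["etsy.com", "marysilver.com", "novica.com"],
    "sterling silver hoop earrings": ["etsy.com", "overstock.com", "zales.com"],
}

counts = Counter(d for domains in serps.values() for d in domains)
for domain, hits in counts.most_common():
    print(f"{domain}: ranks for {hits} of {len(serps)} keywords")
```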

Again, SEMrush can be a valuable tool as you can type the domain into the search box and determine their organic traffic and number of organic keywords ranking in Google.


The more keywords they have ranking in SEMrush, especially for terms with high search volume, the more authoritative the domain is. You can also use SEMrush to search for competitors, but that list is not always accurate and should only be used as a starting point.

In this part of the process, it’s important to find sites that are “truly” competitive with yours. Comparing “Mary’s Silver Earrings” with “Tiffany.com” would not be a fair comparison. It’s important to identify the long tail terms you’d be targeting and to find sites that rank for relevant terms. Do not include the large retail brands in your list – Amazon, Walmart, Bed Bath & Beyond, etc. – as they will simply skew your metrics.

Step 3: Analyze Your Competitors

Now that you know who your competitors are, you need to dive deep into their profiles. You can start by grabbing general metrics for them. LinkResearchTools has a great tool called “Juice Tool” that can be used to get the general metrics for each competitor, including Link Velocity, Domain Authority, Inbound Links, Social Shares, Domain Age, and much more.

These numbers are not enough to form a comprehensive understanding of your competitors; it’s just a start. Next, you need a deep dive into their backlink profile.

1. Download Their Backlink Profile:

Using Ahrefs, you can sort by Domain Rank to view their backlinks from most to least authoritative. This way you can gain an idea of how many high quality links you’ll have to target.


2. Analyze Their Topical Authority:

Using Majestic’s backlink tool, analyze their topical trust flow and understand their semantic link profile.


3. Establish Industry Averages:

The Competitive Landscape Analyzer from LinkResearchTools is a wonderful way to establish industry averages, which will give you guidelines to follow when starting your campaign.

Step 4: Social Media and Content Audit

Now that you have an understanding of who your competitors are and their backlink profile, you’ll need to research their content marketing and social media strategy. How often do they share updates on social media? What is their engagement ratio? How many active followers do they have? Here’s a template for a social media audit questionnaire that can be used as part of this process.

Evaluate their blog, Facebook, Twitter, LinkedIn and Pinterest accounts. This will help you determine how active you need to be in terms of creating and sharing content, and again, to determine your first targets in terms of follower acquisition and engagement ratio.

Step 5: Determine Your Barrier to Entry and Strategy

Once you have a thorough understanding of your competitors, you can create your strategy based on the averages from the data you uncovered.

First you’ll want to know how many links you’ll need to acquire, and the quality of those links, to start showing up in the search results. Of course this will be based on the averages of the sites ranking for the keywords you chose. It’s important to avoid keywords that have a strong presence of sites with massive domain authority, as mentioned above. These domains are tough to beat as they tend to be highly trusted and rewarded by Google.

Next you’ll want to determine how many links you’ll need in each topical category. For example, if you need 100 links to start showing up, how many of those should be in your direct niche vs. a more generic or related niche? Out of those 100 backlinks, how many should have a Domain Rank of 80 or more? 70 or more? Between 40 and 60? Figure out a breakdown based on the industry averages so you can set targets for how to sculpt your backlink outreach and acquisition.
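Here is a back-of-the-envelope Python sketch of that breakdown; the bracket percentages are placeholders you would replace with your own industry averages, not recommendations:

```python
TOTAL_LINKS = 100  # total links needed to start showing up
BRACKETS = {"DR 80+": 0.10, "DR 70-79": 0.15,
            "DR 40-69": 0.45, "DR < 40": 0.30}

for bracket, share in BRACKETS.items():
    print(f"{bracket}: target ~{round(TOTAL_LINKS * share)} links")
```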

Finally, you’ll determine how many articles to share on your blog every week, how many should be keyword vs topically focused, how often to post on social media, how many followers to acquire, etc.

The data you acquire from this research will form the backbone of your SEO strategy, and will create the structure of your campaign. In such a difficult space, it’s important to arm yourself with data, otherwise you will easily waste resources without seeing a return on investment.

Monday, December 4, 2017

Introduction to keyword research for SEO

Here is a 3-step guide providing you with some insights to ensure you are on the right track when researching your keywords of choice. So, you have identified the content; now what?


Step 1: Selecting the Keywords


  • Brainstorm a list of ideas of words/phrases. Ask your co-workers, friends, and suppliers to your business which keywords and phrases they would associate with the content you are looking to create.
  • Imagine you are the customer. Put yourself in the shoes of your customer: what types of keywords and phrases would you use? Phone or email your customers, be open with them, and ask them for their thoughts and ideas too.
  • Competitors. This is a great research resource. List all your competitors who are selling the same product/service and begin to research the words and phrases they are targeting. Check out the source code of your competitors' website product pages – there you can look for meta keyword and description tags, which may give you hints about the keywords they are targeting (see the sketch after this list).
  • Analytics. Consult your analytics and make a note of all the relevant keywords users have used to arrive at your website relating to the specific product/service.
  • Define your list. By now you should have a list of keywords from this research that you can take forward. Do not make the list too exhaustive; try to define and identify 10-15 keywords.
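As a companion to the competitor tip above, here is a minimal Python sketch, using only the standard library, that pulls the meta keywords and description tags out of a page’s HTML source. The URL is hypothetical; swap in a real competitor page.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaTagParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            name = (a.get("name") or "").lower()
            if name in ("keywords", "description"):
                self.meta[name] = a.get("content") or ""

url = "https://www.example.com/products"  # hypothetical competitor page
html = urlopen(url).read().decode("utf-8", errors="replace")
parser = MetaTagParser()
parser.feed(html)
print(parser.meta)  # e.g. {'description': '...', 'keywords': '...'}
```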

Step 2: Defining the List

  • Relevant and popular keywords. From the initial selection of keywords in Step 1, you now need to define the list of keywords you want to target, i.e. those you see as both relevant and popular.
  • Sign in to the Google Keyword Tool, which provides you with insight into the search volume for the range of keywords you have identified through Google, e.g. how many searches are conducted on a monthly basis for a specific word or phrase.
  • Copy and paste the keyword list you created in Step 1 into the keyword tool, choose 'Exact match', and set the region you are looking to target.
  • Record results. The keyword tool will then generate the average monthly search volume for each keyword.

Step 3: Choosing the Keywords

Analyse local monthly searches. If you log in to your Google account, the tool will break down the keywords by a monthly average figure and by region (local) rather than global. From here you can check for peaks in search volume throughout the year.

Monitor keyword ideas. Underneath the keywords you included, Google will also provide a keyword ideas section – a list of alternative and associated keywords you might also want to consider to drive potential search volume.

Do the keywords match your content? This is an important step: make sure the keywords you are looking to take forward match the content you have on your website. There is no point identifying keywords with lots of potential search volume if you don't have the content available on the website to support them.

Hopefully, this provides you with an actionable 3-step guide to creating and defining a list of keywords for your SEO campaign. I would be keen to hear your thoughts and any ideas you may use.





Wednesday, November 29, 2017

Beginner’s Guide to Google Webmaster Tools

Are you looking for some love… from Google? Other than buying paid traffic through their AdWords program, the best way to get more traffic from them is through search engine optimization. But before you start optimizing your site, the first thing you should do is sign up for Google Webmaster Tools (GWT).

GWT is a free toolset provided by Google that helps you understand what’s going on with your website. This way you make decisions based on data instead of going in blindly.

Here is how GWT works.

Adding your website


The first thing you need to do after you login to GWT (it’s free to sign up) is to add your website.

After you add your website, you’ll have to verify that you actually own it. You can do this in one of four ways:


  1. Add a DNS record to your domain’s configuration – You can use this option if you can sign in to your domain registrar or hosting provider and add a new DNS record.
  2. Add a meta tag to your site's homepage – You can choose this option if you can edit your site’s HTML.
  3. Upload an HTML file to your server – You can choose this option if you can upload new files to your site.
  4. Link your Google Analytics account to GWT – You can use this option if your site already has a Google Analytics tracking code that uses the asynchronous snippet. You must be an administrator on the analytics account for this to work.

Dashboard


Once your site is verified you’ll start seeing data on your website. Sometimes it can take a few hours before you see any data, but it’ll start rolling in.

The dashboard gives you a rough overview of everything from what keywords you are ranking for to how much traffic you are getting. In addition to that, you’ll see if the Google bot is experiencing any crawl errors when going to your website, the number of sites linking to yours, and how many pages Google has indexed.

Site Configuration

Just like everything else, Google isn’t perfect. So configuring your site can help them do a better job of ranking your website. When configuring there are a few areas that you should be familiar with:

Sitemaps




Submitting a sitemap will help Google determine what pages you have on your website so they can index them. If you don’t submit a sitemap they may not index all of the pages on your website, which means you won’t get as much traffic.

Sitemaps have to be submitted in an XML format and they can’t contain more than 50,000 URLs or be larger than 10 megs. If you exceed any of those limits, you need to split up your sitemap into multiple files and then submit them.
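For the technically inclined, here is a minimal Python sketch of generating sitemap files and splitting them at the 50,000-URL limit described above. The file naming scheme and example URLs are assumptions, and real URLs should be XML-escaped.

```python
HEADER = ('<?xml version="1.0" encoding="UTF-8"?>\n'
          '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
FOOTER = "</urlset>\n"
LIMIT = 50000  # maximum URLs per sitemap file

def write_sitemaps(urls, prefix="sitemap"):
    for i in range(0, len(urls), LIMIT):
        chunk = urls[i:i + LIMIT]
        with open(f"{prefix}-{i // LIMIT + 1}.xml", "w", encoding="utf-8") as f:
            f.write(HEADER)
            for url in chunk:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write(FOOTER)

write_sitemaps([f"https://www.example.com/page-{n}" for n in range(1, 60001)])
# -> sitemap-1.xml (50,000 URLs) and sitemap-2.xml (10,000 URLs)
```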

If you aren’t technical, you can go to XML Sitemaps to create a sitemap. All you have to do is enter the URL of your homepage and click “start”.

Once your sitemaps have been uploaded, Google will tell you how many of your URLs are being indexed. Don’t worry, it is common for them to not index all of your web pages. But your goal should still be to get as many pages indexed as possible.

Typically if pages aren’t being indexed it’s because the content on those pages isn't unique, the title tags and meta descriptions are generic, and not enough websites are linking to your internal pages.

Crawler access


There will be some pages on your website that you just don’t want Google to index. This could be private login areas, RSS feeds, or crucial data that you don’t want people accessing.

By creating a robots.txt file you can block not just Google, but all search engines from accessing web pages that you don’t want them to get their hands on. However, for highly sensitive areas of your website you may want to consider password protecting all relevant directories.

Through the robots.txt generator and tester, not only will you be able to create a robots.txt file, but you will also be able to check that it is correct before you upload it to your server. This is wise to do, because the last thing you want is to make a mistake and tell the search engines not to index your whole website.
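You can also sanity-check draft robots.txt rules locally. A minimal Python sketch using the standard library’s robot parser; the sample rules are illustrative:

```python
from urllib.robotparser import RobotFileParser

draft = """\
User-agent: *
Disallow: /private/
Disallow: /feed/
"""

rp = RobotFileParser()
rp.parse(draft.splitlines())
for path in ("/private/login", "/blog/post-1"):
    verdict = "allowed" if rp.can_fetch("Googlebot", path) else "blocked"
    print(path, "->", verdict)
# /private/login -> blocked, /blog/post-1 -> allowed
```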

And if you accidentally mess up and find Google indexing pages that you don’t want them to index, you can request them to remove it from this section.


Sitelinks


Sitelinks are links to a site’s interior pages displayed on a Google search results page. Not all sites have sitelinks, but as you grow in popularity you’ll naturally get them. Google generates these links automatically, but you can remove sitelinks you don’t want.

Through this section, you can somewhat control which sitelinks show up when someone searches for your website. The reason you can’t fully control it is that you can only block pages you don’t want to appear; you can’t pick the pages you do want to appear.


Change of address


If you are looking to change the URL of your website, you better let Google know or else your traffic is going to decrease.

You can tell them through 4 easy steps:

  1. Set up the new site – You have to get the new domain up and running. Make sure all your content is available for the public to see.
  2. Redirect the old traffic – A 301 permanent redirect tells users and search engines that your site has permanently moved (a minimal sketch follows below).
  3. Add your new site to GWT – Make sure you also verify your new website.
  4. Tell GWT your new domain – In the change of address section, you can select the new domain name of your website.
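As a sketch of step 2, here is a minimal Python illustration of sending 301 redirects. In practice you would normally configure the redirect in your web server or hosting control panel instead; the new domain below is hypothetical.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_DOMAIN = "https://www.newdomain.com"  # hypothetical new address

class Redirect(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 tells browsers and search engines the move is permanent
        self.send_response(301)
        self.send_header("Location", NEW_DOMAIN + self.path)
        self.end_headers()

HTTPServer(("", 8080), Redirect).serve_forever()  # blocks until stopped
```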

Settings


If your target audience is someone in a specific country, then you can select this option in GWT. For example, if my target customer for KISSmetrics lives in the United States, I would then tell GWT that my target audience lives in the United States.

In addition to that, you can select a preferred domain name. This is going to be http://yourdomain.com or http://www.yourdomain.com. Either one works, you just have to select which variation you prefer. The reason for picking one is that people may link to both versions of your domain and by selecting one Google will combine the links, which will help your rankings.

The last setting you should be worried about is crawl rate. If you feel that the Google bot needs to crawl your website more often and faster, you can tell it to do so. Or you can just let Google pick the crawl setting for your website. (This is typically the best option, because if Google crawls your website too often, the extra bot traffic hitting your server can increase your hosting costs.)

Your website on the web

Have you ever wondered how Google looks at your website? After all, it’s a search engine and not a human… so naturally it won’t be able to look at a website in the same way you do.

But luckily for you, through GWT you can see how Google views your website.


Search queries


Not only is it important to go after keywords that have high search volume, but it is important to make sure that you have a good click-through rate.

By monitoring the search queries page, you can work on improving your click-through rate so that people are more likely to click on your listing when they search. Typically you can do this by making your title tag and meta description more attractive as that is what people read before clicking through to your site.
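Click-through rate itself is simple arithmetic: clicks divided by impressions. A tiny Python sketch with made-up numbers:

```python
# (query, impressions, clicks) -- sample data, not real figures
queries = [
    ("keyword research", 12000, 480),
    ("keyword research tool", 8000, 96),
]

for query, impressions, clicks in queries:
    ctr = clicks / impressions * 100  # CTR = clicks / impressions
    print(f"{query}: {ctr:.1f}% CTR")
# keyword research: 4.0% CTR
# keyword research tool: 1.2% CTR
```

A low CTR at a decent ranking position usually means the title tag and meta description need work.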

Links to your website



The best way to increase your rankings on Google is to get more sites to link to you. Usually, this happens naturally if your website is providing valuable information to potential customers.

A good way to monitor your link growth is to continually monitor this area in GWT. In addition to that, make sure you monitor which pages people are linking to.

If your links aren’t growing fast enough consider writing relevant linkbait that could be submitted throughout the social web. Getting on the homepage of Digg.com can drive thousands of new links to your site.


Keywords


You may have a good idea of what keywords you want to rank for, but that may not be consistent with what Google is ranking you for. Under the keywords section, you can see what keywords your website is the most related to.

You can also see what variations of each keyword that are also relevant to your website. For example, some people misspell keywords and you can find out which misspellings your website is most relevant for.

And if those aren’t the keywords you care to rank for, you can then use that data to adjust the content on your website.


Internal links


Linking your web pages together is a great way to get more Google love. For example, if you want your about page to rank for your company name make sure you link to it multiple times.

If you don’t link to your internal pages, they will not get as much PageRank and they won’t place as well in the search listings.

In addition to that, this data will also help you determine which pages Google feels are the most important. For example, a site owner who links heavily to their About page is signaling that it is one of the most important pages on the site, and naturally, Google tends to treat it that way as well.


Subscriber stats


If you have a blog, this area of GWT will be useful for you. If you don’t, it won’t.

Under the subscriber stats section, you can see which of your blog posts are the most subscribed to by Google’s feed reader. You can then take that data and write more posts that are similar to your popular ones. And of course, you can stop writing blog posts similar to the ones that no one subscribed to, as readers probably didn’t enjoy them as much.

On a side note, if you want to track your RSS growth, you can also check out Feedburner, which will allow you to track how popular your feed is.


Diagnostics

Websites are made by humans, so don’t expect them to be perfect. Your code may be a bit messed up, and even worse, your website may contain malware.

Through the diagnostics section, you can figure out what’s wrong with your site and how you can fix it.


Malware


If you have malware on your server, you should see a message here. If you don’t, GWT won’t show you much.

The reason it is important to not have malware on your server is that Google tries not to rank infected sites high because if someone goes to an infected site, their computer may get infected. If you do happen to have malware, make sure you clean it up.

Crawl errors


The crawl errors section will show you if there are any problems relating to your site on the web or on a mobile phone. The most common error you’ll see is a 404 error, which means Google’s bot can’t find that page.

The most common reason that you’ll see 404 errors is that other websites sometimes link to pages that don’t exist on your website or used to exist.

What you need to do is get a list of all of the websites that are linking to dead pages on your site and hit them up. When emailing them, ask them if they can change that link to a valid page.

Or if you see a lot of people linking to a dead page on your site, you can always 301 redirect that old URL to the new URL.


Crawl stats


If you have thousands of pages on your site, then you should expect Google to crawl most of these pages on a daily or weekly basis. If they aren’t, then something is wrong.

Through the graphs and data tables that GWT provides, you should be able to get a good sense if they are crawling enough pages on your website. If they aren’t, consider adjusting the crawl rate under the settings tab.

HTML suggestions


When Googlebot crawls your site, it may find some issues with your content. These issues won’t prevent your site from appearing in Google search results, but addressing them may help boost your traffic.

The most common problem is related to title tags and meta descriptions. If every page on your site has unique and detailed title tags and meta descriptions, you should be fine. At the same time, you also have to make sure your title tags aren’t too short or too long.

And if that isn’t the case, you can go through the URLs that GWT flags and fix the issues.
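If you can export your page titles and descriptions, a quick script can flag the most common problems. A minimal Python sketch; the length thresholds and sample data are assumptions, not Google’s official limits:

```python
pages = {  # URL -> (title, meta description) -- sample data
    "/": ("Acme Restaurant Supply | Wholesale Equipment", "Wholesale supplies..."),
    "/about": ("About", "Wholesale supplies..."),
}

seen_descriptions = {}
for url, (title, desc) in pages.items():
    if not title or len(title) < 10:
        print(f"{url}: title too short")
    elif len(title) > 70:
        print(f"{url}: title too long")
    if desc in seen_descriptions:
        print(f"{url}: duplicate description (also on {seen_descriptions[desc]})")
    seen_descriptions.setdefault(desc, url)
```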

Labs

GWT regularly tests out new features. The easiest way to find out about these new features is to go through the Labs section.

Fetch as Googlebot


With Fetch as Googlebot, you can see exactly how a page appears to Google. All you have to do is type in a URL, and GWT will tell you whether it could successfully see it or not.

There currently isn’t a ton of data that GWT is showing in this area, but I expect this to change in the future.


Sidewiki


If you’re a webmaster, you can leave a special Sidewiki entry on pages of your site. You can choose to leave a master entry for the whole site, or create page specific entries to engage your visitors.

All you have to do is:

  1. If you’ve successfully validated your account in GWT, you will see an option to write as the page owner.
  2. Select the “Write as the page owner” checkbox in the entry form. If you’d like to leave a master entry across the whole site, also select the “Show this page owner entry on all pages…” checkbox.
  3. Click Publish.

Site performance


Your website’s load time is one of the most important things you should be monitoring. Every month you should be making sure you improve this number because if your website is too slow your Google traffic may drop.

Numerous website owners have seen a positive increase in their traffic by improving their website load time.

If you aren’t sure how fast your website should load, don’t worry. Google will tell you if your website is too slow or quick enough.


Video sitemaps


If you have videos on your site, you want to make sure you include those raw video files in your sitemap. This way Google can index them, as it may not be able to find them otherwise.

This will help ensure that your videos are getting the traffic they deserve from Google video search.

If you are trying to create a video sitemap, this page should explain how to do so.


Conclusion

GWT is a useful tool that’s free. If you aren’t making use of it, you should start doing so now. The reason it’s worth using is that it will help guide you and tell you what to do if you want to improve your Google traffic.

Monday, November 27, 2017

Google PageRank Technology Explanation

The following is an extract from the Google Technology web page at
http://www.google.com/technology/



PageRank Introduction

Google runs on a unique combination of advanced hardware and software. The speed you experience can be attributed in part to the efficiency of our search algorithm and partly to the thousands of low-cost PC's we've networked together to create a superfast search engine.

The heart of our software is PageRank™, a system for ranking web pages developed by our founders Larry Page and Sergey Brin at Stanford University. And while we have dozens of engineers working to improve every aspect of Google on a daily basis, PageRank continues to provide the basis for all of our web search tools.

PageRank Explained

PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important."

Important, high-quality sites receive a higher PageRank, which Google remembers each time it conducts a search. Of course, important pages mean nothing to you if they don't match your query. So, Google combines PageRank with sophisticated text-matching techniques to find pages that are both important and relevant to your search. Google goes far beyond the number of times a term appears on a page and examines all aspects of the page's content (and the content of the pages linking to it) to determine if it's a good match for your query.
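The core “votes weighted by the voter’s own importance” idea can be captured in a few lines of power iteration. This is a toy Python sketch using the commonly cited damping factor of 0.85; Google’s production system is, of course, vastly more elaborate.

```python
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small baseline, plus weighted incoming votes
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

# A links to B and C; B and C each link back to A
print(pagerank({"A": ["B", "C"], "B": ["A"], "C": ["A"]}))
# A ends up with the highest score, since it receives both votes
```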

PageRank Integrity

Google's complex, automated methods make human tampering with our results extremely difficult. And though we do run relevant ads above and next to our results, Google does not sell placement within the results themselves (i.e., no one can buy a higher PageRank). A Google search is an easy, honest and objective way to find high-quality websites with information relevant to your search.

Saturday, November 25, 2017

Latest Google SEO Updates & Algorithms

The digital world is now more hyped-up, dynamic, and influential than ever before. It is more focused and competitive as well. To achieve high search engine rankings and maintain them, you have to keep up with the latest Google SEO updates for 2017. This is the first move towards staying aware of the latest SEO trends and remaining focused.

SEO updates follow directly from the algorithm updates that Google rolls out. Since Google is the pioneer in search marketing, the changes in the 2017 Google algorithm updates are vital to enhancing the optimization of your website. Website admins need a phenomenal understanding of all the latest Google search engine algorithm updates and related procedures, as only this can tell them which of the latest 2017 SEO updates are essential to optimize websites, ensure better DA, and achieve high rankings in SERPs.

Largely, Google is centered on enhancing its web search services for online users, and by keeping track of changes in Google’s algorithm updates, marketers can increase the ranking of their sites. Google has a long history of famous SEO algorithm updates that shape the ranking mechanism of SERPs.


To find the latest Google SEO updates, marketers need to check for the latest updates to the following SEO algorithms:


10 Google SEO Updates & Algorithm Changes in 2017

1.) Google Hummingbird Update





Introduced in August 2013, the Google Hummingbird Update is Google’s new search algorithm that plays a significant role in deciding the ranking of websites. It is made up of 200+ factors that can affect search results and website ranking. The biggest change in Hummingbird was its sharper eye on mobile search, which is not surprising at all given the explosion of smartphones in recent years. The name ‘Hummingbird’ comes from its ability to be “precise and fast”, and it is mainly designed to focus on the meaning of a whole phrase or keyword rather than individual keywords. Hummingbird looks at the entire phrase to decipher its meaning, and pages matching that meaning do better in search results.


New SEO updates related to Hummingbird


  • Application of meaning technology to billions of pages from across the web
  • Use of Knowledge Graph facts to ensure better search results
  • Easy recognition of Keyword stuffing
  • Effectiveness of Long-tail keywords

2.)  Google Penguin Update


Google launched the Penguin Update in April 2012 to catch websites spamming Google’s search results. This update is mainly aimed at decreasing the search rankings of websites that violate Google’s Webmaster Guidelines and use black-hat SEO techniques to artificially increase their rankings, for example by obtaining or buying links through wrong practices. The primary reason behind this update was to penalize websites that use manipulative techniques to achieve high rankings. As per Google’s estimates, Penguin influences approximately 3.1% of search queries in English, approximately 3% of queries in languages like German, Arabic, and Chinese, and an even much bigger percentage in “highly spammed” language categories. Pre-Penguin, sites commonly used negative external link building tactics to rank well in SERPs and boost their traffic. Once Penguin was introduced, however, content became vital: sites with incredible content would be recognized, and those with thin or spammy content would be punished.


Some confirmed Google Penguin SEO updates are


  • Penguin 1 – on April 24, 2012 (impacting around 3.1% of queries)
  • Penguin 2 – on May 26, 2012 (impacting less than 0.1%)
  • Penguin 3 – on October 5, 2012 (impacting around 0.3% of queries)
  • Penguin 4 (a.k.a. Penguin 2.0)- on May 22, 2013 (impacting 2.3% of queries)
  • Penguin 5 (a.k.a. Penguin 2.1)- on October 4, 2013 (impacting around 1% of queries)
  • Penguin 6 (a.k.a. Penguin 3.0) – on October 17, 2014 (impacting less than 1% of English queries). On December 1, 2014, Google confirmed that the update was still rolling out, with webmasters continuing to report significant fluctuations.
  • Penguin 7 (a.k.a. Penguin 4.0)- on September 23, 2016

3.) Google Panda Update



Google’s Panda Update was introduced in February 2011, and it is known as a powerful search filter meant to stop sites with low-quality content from making their way into Google’s top search results. Panda is updated every once in a while; when this happens, sites previously hit may escape if they have rolled out the correct improvements according to the Panda updates. Through its different updates, Panda can likewise catch sites that escaped before. Google Panda was quite effective in affecting the ranking of entire sites or specific sections rather than individual pages.

Some important Google SEO updates according to Google Panda Update are


  • No Multiple Pages with the Same Keyword
  • Get Rid of Auto-generated Content and Roundup/Comparison Type of Pages
  • No Pages with 1-2 Paragraphs of Text Only
  • No Scraped Content
  • Panda Likes New Content
  • Be Careful with Affiliate Links and Ads
  • Too Many Outbound Links with Keywords are bad

4.) Google Pigeon Update



Launched on July 24, 2014, for U.S. English results, the Google Pigeon Update is another SEO algorithm update, introduced to give more useful, significant, and exact local search results that are tied more closely to conventional web search ranking factors. Google said that this new algorithm enhances its distance and location tracking parameters in a more result-oriented manner. The changes made through the Pigeon Update also affect the search results shown in Google Maps, as this update lets Google provide results based on the user’s location and the listings at hand in the local directory. The main purpose of the Pigeon Update is to give preference to local results in SERPs, which is why it is extremely beneficial for local businesses.

Latest updates in SEO based on Google Pigeon Updates are

  • Location Matters More Than Ever
  • Don’t Over-Optimize Your Website
  • Strong Domains Matter more

5.) Google Mobile-Friendly Update

On April 21, 2015, Google introduced its Mobile-Friendly search algorithm, which is intended to give a lift to mobile-friendly pages in Google’s mobile search results. The change is significant to the point that the date it happened has been given a variety of names, such as Mobilegeddon, Mobilepocalypse, or Mobocalypse. One of the ideal ways to prepare is to test that Google considers your site pages mobile friendly by using its Mobile-Friendly Test tool. The update has been very effective in pushing SEO campaigns towards greater mobile-friendliness.

Latest Google Mobile-Friendly SEO updates:
  • Google mobile-friendly testing tool now has API access
  • Google may pick desktop over AMP page for the mobile-first index
  • Google begins mobile-first indexing, using mobile content for all search rankings
  • Google will show AMP URLs before App deep link URLs in mobile results
  • Google says page speed ranking factor to use mobile page speed for mobile sites

6.) Google: Payday Update

Launched on June 11, 2013, the Google Payday Update was a new search algorithm aimed at cleaning up results for “spammy queries” such as payday loans, pornography, and other heavily spammed queries. It can be understood as a set of algorithm updates to the Google search engine results, initiated to identify and penalize websites that use various search engine spam techniques (also known as black-hat SEO or spamdexing) to improve their rankings for particular search queries that are actually “spammy”. Let’s have a look at some recent updates:


Recent Google Payday updates are

  • Google Payday Loan 1.0
  • Google Payday Loan 2.0
  • Google Payday Loan 3.0

7.) Google: Pirate Update


Introduced in August 2012, Google’s Pirate Update is a filter that demotes sites that have received many copyright infringement reports, as documented through Google’s DMCA system. It is periodically updated, and when updates happen, websites previously affected may escape if they have made the correct changes. It may likewise catch new websites that avoided being caught before; in addition, it may release “false positives” that were caught by mistake.

Some of the latest Google Pirate SEO updates:

  • The Pirate Update Penalized Websites That Received A High Volume Of Copyright Violation Reports
  • The Pirate Update Is A Win For Media And Artists
  • Getting A Page Removed From The Index Requires Valid Documentation

8.) Google: EMD Update

Launched in September 2012, the EMD (Exact Match Domain) Update is a filter Google uses to prevent low-quality sites from ranking well simply because they have words matching search terms in their domain names. When a fresh EMD Update happens, sites that have improved their content may regain good rankings. New sites with poor content, or those previously missed by earlier EMD updates, may get caught. Likewise, “false positives” may also get released.


9.) Google: Top Heavy Update

The Google Top Heavy Update was launched in January 2012 as a way to prevent sites that were “top heavy” with advertisements from ranking well in Google search listings. Top Heavy is updated repeatedly; when an update occurs, websites that have removed excessive advertisements may regain their lost rankings, while new sites considered “top heavy” may get caught by the next update.


Some of the Google Top Heavy SEO Updates

  • Google Updates Its Page Layout Algorithm To Go After Sites “Top Heavy” With Ads
  • Have The Same Ad-To-Organic Ratio As Google Search? Then You Might Be Safe From The Top Heavy Penalty
  • The Top Heavy Update: Pages With Too Many Ads Above The Fold Now Penalized By Google’s “Page Layout” Algorithm

10.) Google Page Rank Update


If you do SEO or are involved in search marketing, you will surely come across the topic of Google PageRank eventually. PageRank is Google’s system of counting link votes and determining which pages are most important based on them. These scores are then used, alongside many other factors, to determine whether a page will rank well in a search. However, some experts now consider PageRank an out-of-date, deprecated metric and suggest that marketers not waste time on it. Google released its last Toolbar PageRank update on 5/6 December 2013 and thereafter declared: “PageRank is something that we haven’t updated for over a year now, and we’re probably not going to be updating it again going forward, at least the Toolbar version.”

The Toolbar Page Rank updates were released as follows:


  • Toolbar Page Rank Updates released on 5/6 December 2013 (LAST PAGERANK UPDATE EVER)
  • Toolbar Page Rank Updates released on 4 February 2013
  • Toolbar Page Rank Updates released on 7 November 2012
  • Toolbar Page Rank Updates released on 2 August 2012
  • Toolbar Page Rank Updates released on 2 May 2012
  • Toolbar Page Rank Updates released on 7 February 2012
  • Toolbar Page Rank Updates released on 7 November 2011
  • Toolbar Page Rank Updates released in the first week of August 2011
  • Toolbar Page Rank Updates released in July 2011
Once you are aware of all the Google search algorithm updates, the next step is to stay in constant touch with top SEO resources to learn about all the latest Google SEO updates.












Thursday, November 23, 2017

Search Engines and Algorithms

In this article, we are going to look at search engine algorithms, how diverse they are, what they have in common, why it’s important to know their differences, and how to make this information work for you in SEO. There is something for everyone, from the novice to the expert. Over the course of this series, we will look at optimizing your site for specific search engines, as well. The top six major players we will look at in this series are AOL Search, Google, and AskJeeves in the first article; Yahoo! and AltaVista in part 2; MSN in part 3; and in the last article, part 4, we’ll look at MetaSearch Engines.

Just about everyone knows what a search engine is.  Whenever you have a question, want to look up the address of your favorite restaurants or need to make a qualified online purchase, chances are, you visit a search engine on the Internet.

If you’ve ever used two different search engines to conduct the same search query, you will have noticed that the results weren’t the same.  So why does the same query on different search engines produce different results? Part of the answer is that not all search engine indexes are exactly the same; it depends on what the spiders find or what information humans have submitted to the database. But more importantly, not every search engine uses the same algorithm to search through its database.  An algorithm is what a search engine uses to determine the relevance of the information in its database to what the user is searching for.

What is a Search Engine Algorithm?


A search algorithm is defined as a math formula that takes a problem as input and returns a solution to the problem, usually after evaluating a number of possible solutions.  A search engine algorithm uses keywords as the input problem, and returns relevant search results as the solution, matching these keywords to the results stored in its database.  These keywords are determined by search engine spiders that analyze web page content and keyword relevancy based on a math formula that will vary from one search engine to the next.

Types of Information that Factor into Algorithms


Some services collect information on the queries individual users submit to search services, the pages they look at subsequently, and the time spent on each page. This information is used to return results pages that most users visit after initiating the query. For this technique to succeed, large amounts of data need to be collected for each query. Unfortunately, the potential set of queries to which this technique applies is small, and this method is open to spamming.

Another approach involves analyzing the links between pages on the web, on the assumption that pages on the same topic link to each other, and that authoritative pages tend to point to other authoritative pages.  By analyzing how pages link to each other, an engine can both determine what a page is about and whether that page is considered relevant.  Similarly, some search engine algorithms figure internal link navigation into the picture.  Search engine spiders follow internal links to weigh how each page relates to another, and consider the ease of navigation.  If a spider runs into a dead-end page with no way out, this can be weighed into the algorithm as a penalty (a crude sketch of both ideas follows).
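As a hedged illustration only (not any engine’s actual method), this sketch counts inbound links as a rough authority signal and flags dead-end pages that a spider might penalize. The link graph is invented sample data.

    # Crude link analysis over an invented link graph: pages with more
    # inbound links score higher, and dead-end pages (no outgoing links)
    # are flagged, since some algorithms treat them as a penalty.
    links = {
        "home": ["about", "products"],
        "about": ["home"],
        "products": ["home", "deadend"],
        "deadend": [],  # a page with no way out
    }

    inbound = {page: 0 for page in links}
    for source, targets in links.items():
        for target in targets:
            inbound[target] += 1

    dead_ends = [page for page, outgoing in links.items() if not outgoing]
    print(inbound)    # {'home': 2, 'about': 1, 'products': 1, 'deadend': 1}
    print(dead_ends)  # ['deadend']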

Original search engine databases were made up entirely of human-classified data.  This is a fairly archaic approach, but there are still many directories feeding search engine databases, like the Open Directory (also known as DMOZ), that are classified entirely by people.  Some search engine data are still managed by humans, but only after the algorithmic spiders have collected the information.

One of the elements a search engine algorithm scans for is the frequency and location of keywords on a web page.  Pages where the keywords appear more frequently are typically considered more relevant; this measure is referred to as keyword density.  Some search engine algorithms also factor in where on the page the keywords are located.
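Keyword density is simple enough to compute for your own pages; here is a minimal sketch (the sample text is invented):

    # Keyword density = occurrences of a keyword divided by total words,
    # expressed as a percentage. The sample text is invented.
    def keyword_density(text, keyword):
        words = text.lower().split()
        if not words:
            return 0.0
        return words.count(keyword.lower()) / len(words) * 100

    sample = "Tesla designed an alternating current motor while Edison backed direct current"
    print(round(keyword_density(sample, "current"), 1))  # 18.2 (2 of 11 words)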

Like keywords and usage information, meta tag information has been abused.  Many search engines do not factor in meta tags any longer, due to web spam.  But some still do, and most look at Title and Descriptions.  There are many other factors that search engine algorithms figure into the calculation of relevant results.  Some utilize information like how long the website has been on the Internet, and still others may weigh structural issues, errors encountered, and more.

Why are Search Engines so different?


Search engine algorithms are highly secret, competitive things.  No one knows exactly what each search engine weighs or what importance it attaches to each factor in its formula, which leads to a lot of assumption, speculation, and guesswork.  Each search engine employs its own filters to remove spam, and each even has its own differing guidelines for determining what web spam is!

Search engines generally implement two or three major updates every year.  One simply has to follow the patent filings to know this.  Even if you are not interested in a patent itself, filings can give you a heads-up about changes that may follow in a search engine’s algorithm.

Another reason that search engines are so diverse is the widespread use of technology filters to sort out web spam.  Some search engines change their algorithms to include certain filters, while others don’t change the basic algorithms but implement filters on top of the basic calculations.  According to the dictionary, filters are essentially “higher-order functions that take a predicate and a list and return those elements of the list for which the predicate is true.”  A simpler way to think of a search engine filter is as a water purifier: the water passes through a device made of porous material that removes unwanted impurities.  A search engine filter likewise seeks to remove unwanted “impurities” from its results.
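That definition maps directly onto code: a predicate decides what counts as spam, and the filter keeps only the results that pass. A minimal sketch, with an invented spam test standing in for a real heuristic:

    # A search engine "filter" in the higher-order-function sense:
    # a predicate plus a list in, the passing elements out.
    # The spam test below is invented for illustration.
    results = ["useful-guide.example", "buy-cheap-pills.example", "honest-review.example"]

    def looks_like_spam(url):
        return "cheap-pills" in url  # stand-in for a real spam heuristic

    clean_results = list(filter(lambda url: not looks_like_spam(url), results))
    print(clean_results)  # ['useful-guide.example', 'honest-review.example']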


How to achieve good SEO among all the mainstream search engines

It may seem like a daunting task to please all of the many search engines out there, as there are thousands. Still, there are several pieces of advice I would give in order to streamline your SEO efforts among all of the major search engines.

Do your keyword research.  This means learning how many times your keywords are being searched for every day, what competition you have, and how these relate to your content on each page.

Select 3–5 phrases to optimize for on each page, instead of a whole slew of keywords.  The more keywords you try to use, the more diluted your keyword density becomes.  Choose keywords for each page, not for the entire site.  (Keyword density ranges for the search engines discussed here run from .07% for Google to 1.7% for Yahoo.)

Write unique, compelling titles for each page.  Titles are still important to all five top search engines.

Focus on writing unique content that adds value to users and incorporates valuable keywords.  Write for ideas, not keywords.  When you are finished with your ideas, your keywords should result from the content, and not the content from the keywords.

Ensure site architecture and design do not prohibit thorough search engine crawling.  Clear navigation, readable code, no broken links, and validated markup will not only make it easier for search engine spiders to crawl your website, but will also mean better page stickiness for your visitors.

Build high-quality, relevant, inbound links.  Since all search engines rely upon inbound links to rank the relevancy of a site, it is good to concentrate on this area.  Don’t inflate your backlinks with artificial links.  Build organic, natural links, and keep the sites you link to relevant.  Avoid link directories on your website.  For every search engine, the more quality links, the better.

Be familiar with how the top search engines work.  When you do your homework and understand the workings of the search engines, it will help you determine what search engines look for in a website they consider relevant, and what practices to stay away from.

Stay with it. SEO is not a one-time project. Continual growth in content, links, and pages is required for long-term success.  The key to keeping your site relevant to search engines is fresh and unique content.  This applies to keywords, links and content.

Don’t sweat the small stuff.  Changes in search engine algorithms should not affect you too much if you follow the basic principles of SEO.  While tweaking may be necessary, there is certainly no cause for alarm.

So why is understanding the search engines so important?  As my son would say, “Mom, you gotta know your enemy.”  Well, not necessarily the enemy in this case, but what he says has a grain of truth in it when it comes to search engines.  You have to be familiar with the search engines if you ever hope to optimize for them.  The more you familiarize yourself with the way they work and treat websites in relevancy of search results, the better your chances will be for ranking in those search engines.

If you optimize only for Google or Yahoo, however, you’ll be missing out on two-thirds of all of the search engine traffic, and only optimizing for MSN means you’ll lose about 85% of potential search traffic.  Optimizing for several search engines is going to be more easily done if you understand the basic concepts of each search engine.

Does this mean that any time a search engine changes you have to go running to change your website?  No.  In fact, one of the most reasonable pieces of advice I can give you when a search engine algorithm changes is not to panic.  However, when a major update is implemented, being familiar with the search engines is your best possible defense; that way, the new filters don’t force upon you a huge learning curve.

So while trying to figure out the exact formulas for each search engine’s algorithm will be a never-ending, insurmountable source of frustration for any SEO, focusing instead on ways to make all of them happy at once seems like a better use of your time, if not just as frustrating.  But if you follow the basic concepts of SEO, then you’ll not only be found in the search engine results but be a more relaxed individual in the process.

Differences in the Mainstream Search Engines


Knowing your “enemy” will go a long way in helping you understand why your website may be performing the way it is in a particular search engine.  Search engine algorithms change constantly, some daily.  There is no way to predict exactly when or how a search engine will change, but there are trends to follow.  The major search engines all weigh the same basic factors in their relevancy scores, but each weights them differently.

The breakdown of the market share of search queries that these six search engines currently control is listed below.  These figures were taken from Forbes.com for August 2005.

  • Google: 37.3%, up from 36.5% in July
  • Yahoo: 29.7%, down from 30.3% in July
  • MSN: 15.8%, up from 15.5% in July
  • AOL: 9.6%, down from 9.9% in July
  • AskJeeves: 3%
  • AltaVista: 1.6%

AOL Search


AOL claims to be the first major search engine featuring “clustering” technology: search results are automatically clustered into relevant topics and displayed alongside the list of general results, using technology licensed from Vivisimo.  However, 80% of searches done on AOL currently use Google’s databases.  For all intents and purposes, optimizing for AOL Search is currently very similar to optimizing for Google.  Of the top ten results for “pets” in both AOL Search and Google, there is no difference in results, and for the search term “pet feeding supplies,” there is only one variance.  Similar results were achieved for other randomly chosen keywords and phrases.  AOL Search queries make up approximately 16% of searches on the web.

Google


Google uses what is commonly known as the Hilltop Algorithm, or the “Austin Update.”  Hilltop emphasizes the voting power of what it considers “authority sites.” These are websites or pages that Google assesses to be of strong importance on a particular keyword topic.

The concept of link popularity was developed to improve the quality of search results and, especially, to make search engines resistant to automatically generated web pages built to exploit content-specific ranking criteria (doorway pages).  Link popularity weighs very heavily into Google’s PageRank algorithm.

According to Wikipedia, “PageRank is a family of algorithms for assigning numerical weightings to hyperlinked documents (or web pages) indexed by a search engine.”

There are over 100 factors calculated into Google’s PageRank algorithm.  What exact weight is given to each factor is unknown, but we do know that backlinks are probably among the most heavily weighted factors in determining relevancy.  Other factors might be keyword density, date of domain registration or age of the website, clean design, error-free pages, text navigation, the absence of spam, and more.  We aren’t sure whether these things are factored into the algorithm itself, or are just used in filters employed by Google.  The link-voting core, though, is public, and is sketched below.
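The iterative “link voting” computation comes from the original PageRank paper; the four-page graph below is invented, and the 0.85 damping factor is the paper’s classic value. Treat this as a minimal sketch of the public idea, not Google’s production algorithm.

    # Minimal PageRank power iteration over an invented four-page graph.
    # damping = 0.85 is the value used in the original PageRank paper.
    links = {
        "a": ["b", "c"],
        "b": ["c"],
        "c": ["a"],
        "d": ["c"],
    }
    damping = 0.85
    n = len(links)
    rank = {page: 1.0 / n for page in links}

    for _ in range(30):  # iterate until the scores settle
        new_rank = {page: (1 - damping) / n for page in links}
        for source, targets in links.items():
            share = rank[source] / len(targets)
            for target in targets:
                new_rank[target] += damping * share
        rank = new_rank

    print({page: round(score, 3) for page, score in rank.items()})
    # "c" ends up highest: it receives the most link "votes".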

Google is probably the most mysterious of all of the top search engines and is currently the most popular search engine, estimated to handle about 35% of all searches made.  While currently only ranking at #3, we could easily expect this to change in the near future.


AskJeeves

AskJeeves is a directory built entirely by human editors, with results presented as a set of related questions for which “answers,” or links, exist.  Enter your question in plain English into the text box, then click on the “Ask” button and Jeeves does the rest.  Now, the search engine’s being rebranded, and its mascot, the butler, is soon to be history.

This search engine is powered by Teoma.  A few observations by fellow SEOs have noted that Teoma holds orphans in its index longer than any other search engine, so if you utilize redirects, whether temporary (302) or permanent (301), there is a high chance that your old URLs will stay in the index for a long time.  The search engine’s re-index schedule in the past used to be between three and six months, with occasionally a couple of crawls in one month, but re-indexing has always been fairly sporadic and mostly shallow, and it seems that sponsored listings receive far more attention than any site listed organically in the index.  There is no way to submit your website to Teoma’s or AskJeeves’ index unless you pay for sponsored listings, or you can simply wait for the robot to crawl your site.  This can take a long time.

So as a pay-per-click platform, AskJeeves seems to be a good place to advertise.  AskJeeves displays sponsored listings from Google’s AdWords program.  If you get well-qualified traffic from your AdWords campaigns, or you have an Amazon.com store, then you’ll have better luck with AskJeeves.  The key word here is “qualified.”  There seems to be a higher amount of click fraud, by nature, with AskJeeves.  I believe the primary reason for this is the way that AskJeeves SERPs are shown.  In Google, sponsored listings are distinctly set apart from the organic listings: either highlighted in blue at the top or down the right side.  With AskJeeves, organic results look just like pay-for-placement listings.



In the next article, we will look at AltaVista’s and Yahoo!’s search engines in depth, showing you what they specifically look for in a web page to deem it relevant to their search results.  Stay tuned!


Wednesday, November 22, 2017

Google Search Engine Architecture

Introduction

A search engine is, at its core, a huge database of Internet resources such as web pages, newsgroups, programs, and images.  It helps users locate information on the World Wide Web.

Users can search for information by entering a query in the form of keywords or a phrase.  The search engine then looks for relevant information in its database and returns it to the user.


Search Engine Components

Generally, there are three basic components of a search engine, as listed below:


  1. Web Crawler
  2. Database
  3. Search Interfaces


Web crawler

Also known as a spider or bot, the web crawler is a software component that traverses the web to gather information.


Database

All the information gathered from the web is stored in the database, which consists of these huge web resources.


Search Interfaces

This component is the interface between the user and the database.  It helps the user search through the database.

Search Engine Working


The web crawler, the database, and the search interface are the major components that actually make a search engine work.  Search engines make use of the Boolean expressions AND, OR, and NOT to restrict and widen the results of a search.  The following steps are performed by the search engine (a minimal sketch of the lookup step appears after this list):

  • The search engine looks for the keyword in its index of the predefined database instead of going directly to the web to search for the keyword.
  • That index is built ahead of time by the software that gathers pages from the web; this software component is the web crawler.
  • Once matching pages are found in the index, the search engine shows the relevant web pages as results.  These retrieved listings generally include the title of the page, the size of the text portion, the first several sentences, and so on.  The exact criteria vary from one search engine to another, and the retrieved information is ranked according to factors such as frequency of keywords, relevancy of information, links, etc.
  • The user can click on any of the search results to open it.
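Here is a minimal sketch of that lookup step: Boolean retrieval against a tiny, invented inverted index that maps each keyword to the set of pages containing it. Real indexes also store titles, positions, and snippets.

    # Boolean AND / OR / NOT retrieval against a tiny invented inverted
    # index, which maps each keyword to the set of pages containing it.
    index = {
        "tesla":  {"p1", "p3"},
        "edison": {"p2", "p3"},
        "motor":  {"p1"},
    }

    def lookup(word):
        return index.get(word, set())

    print(lookup("tesla") & lookup("edison"))  # AND -> {'p3'}
    print(lookup("tesla") | lookup("edison"))  # OR  -> {'p1', 'p2', 'p3'}
    print(lookup("tesla") - lookup("motor"))   # tesla NOT motor -> {'p3'}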

Architecture

The search engine architecture comprises the three basic layers listed below:
  • Content collection and refinement
  • Search core
  • User and application interfaces



Search Engine Processing

Indexing Process


The indexing process comprises the following three tasks:

  • Text acquisition

  • Text transformation

  • Index creation

TEXT ACQUISITION
Identifies and stores documents for indexing.

TEXT TRANSFORMATION
Transforms documents into index terms or features.

INDEX CREATION
Takes the index terms created by text transformation and creates data structures to support fast searching.
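The three tasks chain together naturally. A minimal sketch (the documents and the stop-word list are invented) that acquires text, transforms it into index terms, and creates the inverted index:

    # The three indexing tasks in miniature: acquire documents, transform
    # text into index terms, and create the inverted index. The documents
    # and the stop-word list are invented for illustration.
    documents = {
        "doc1": "The spider crawls the web",
        "doc2": "The web stores pages in a database",
    }
    stop_words = {"the", "a", "in"}

    def transform(text):
        # Text transformation: lowercase, split, drop stop words.
        return [w for w in text.lower().split() if w not in stop_words]

    index = {}
    for doc_id, text in documents.items():          # text acquisition
        for term in transform(text):                # text transformation
            index.setdefault(term, set()).add(doc_id)  # index creation

    print(index["web"])  # {'doc1', 'doc2'}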

Examples

Following are several of the search engines available today: Google, Yahoo!, MSN Search, AOL Search, AskJeeves, and AltaVista.

Tuesday, November 21, 2017

Google Search Operators

What are Google search operators?

Google search operators are special characters and commands (sometimes called “advanced operators”) that extend the capabilities of regular text searches. Search operators can be useful for everything from content research to technical SEO audits.

How do I use search operators?

You can enter search operators directly into the Google search box, just as you would a text search:


Except in special cases (such as the “in” operator), Google will return standard organic results.
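Operators can also be composed programmatically when you want to run the same audit-style query for many terms. A minimal sketch, assuming the standard Google search URL format (https://www.google.com/search?q=...); the terms are invented examples combining operators from the cheat sheet below:

    # Build a Google query combining exact-match quotes, OR, and
    # exclusion, then URL-encode it into a search URL. The terms are
    # invented examples; the URL format is the standard Google one.
    from urllib.parse import quote_plus

    query = '"nikola tesla" (biography OR history) -motors'
    url = "https://www.google.com/search?q=" + quote_plus(query)
    print(url)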

Google search operators cheat sheet

You can find all of the major organic search operators below, broken up into three categories: “Basic”, “Advanced”, and “Unreliable”. Basic search operators are operators that modify standard text searches.

I. Basic Search Operators


" " :"nikola tesla"

Put any phrase in quotes to force Google to use exact-match. On single words, prevents synonyms.

OR: tesla OR edison

Google search defaults to logical AND between terms. Specify "OR" for a logical OR (ALL-CAPS).

|: tesla | edison

The pipe (|) operator is identical to "OR". Useful if your Caps-lock is broken :)

( ): (tesla OR edison) alternating current

Use parentheses to group operators and control the order in which they execute.

-: tesla -motors

Put minus (-) in front of any term (including operators) to exclude that term from the results.

*: tesla "rock * roll"

An asterisk (*) acts as a wild-card and will match on any word.

#..#: tesla announcement 2015..2017

Use (..) with numbers on either side to match on any integer in that range of numbers.

$: tesla deposit $1000

Search prices with the dollar sign ($). You can combine ($) and (.) for exact prices, like $19.99.

€: €9,99 lunch deals

Search prices with the Euro sign (€). Most other currency signs don't seem to be honored by Google.

in: 250 kph in mph

Use "in" to convert between two equivalent units. This returns a special, Knowledge Card style result.

Advanced search operators are special commands that modify searches and may require additional parameters (such as a domain name). Advanced operators are typically used to narrow searches and drill deeper into results.


II. Advanced Search Operators

intitle: intitle:"tesla vs edison"

Search only in the page's title for a word or phrase. Use exact-match (quotes) for phrases.

allintitle: allintitle: tesla vs edison

Search the page title for every individual term following "allintitle:". Same as multiple intitle:'s.

inurl: tesla announcements inurl:2016

Look for a word or phrase (in quotes) in the document URL. Can combine with other terms.

allinurl: allinurl: amazon field-keywords nikon

Search the URL for every individual term following "allinurl:". Same as multiple inurl:'s.

intext: intext:"orbi vs eero vs google wifi"

Search for a word or phrase (in quotes), but only in the body/document text.

allintext: allintext: orbi eero google wifi

Search the body text for every individual term following "allintext:". Same as multiple intext:'s.

filetype: "tesla announcements" filetype:pdf

Match only a specific file type. Some examples include PDF, DOC, XLS, PPT, and TXT.

related: related:nytimes.com

Return sites that are related to a target domain. Only works for larger domains.

AROUND(X): tesla AROUND(3) edison

Returns results where the two terms/phrases are within (X) words of each other.