Wednesday, November 29, 2017

Beginner’s Guide to Google Webmaster Tools

Are you looking for some love… from Google? Other than buying paid traffic through their AdWords program, the best way to get more traffic from them is through search engine optimization. But before you start optimizing your site, the first thing you should do is sign up for Google Webmaster Tools (GWT).

GWT is a free toolset provided by Google that helps you understand what’s going on with your website. This way you make decisions based on data instead of going in blindly.

Here is how GWT works.

Adding your website

add a website to google webmaster tools

The first thing you need to do after you log in to GWT (it’s free to sign up) is to add your website.

After you add your website, you’ll have to verify that you actually own it. You can do this in one of four ways:


  1. Add a DNS record to your domain’s configuration – You can use this option if you can sign in to your domain registrar or hosting provider and add a new DNS record.
  2. Add a meta tag to your site's homepage – You can choose this option if you can edit your site’s HTML.
  3. Upload an HTML file to your server – You can choose this option if you can upload new files to your site.
  4. Link your Google Analytics account to GWT – You can use this option if your site already has a Google Analytics tracking code that uses the asynchronous snippet. You must be an administrator on the analytics account for this to work.

Dashboard

google webmaster tools dashboard

Once your site is verified you’ll start seeing data on your website. Sometimes it can take a few hours before you see any data, but it’ll start rolling in.

The dashboard gives you a rough overview of everything from what keywords you are ranking for to how much traffic you are getting. In addition to that, you’ll see if the Google bot is experiencing any crawl errors when going to your website, the number of sites linking to yours, and how many pages Google has indexed.

Site Configuration

Just like everything else, Google isn’t perfect. So configuring your site can help them do a better job of ranking your website. When configuring there are a few areas that you should be familiar with:

Sitemaps


google webmaster tools sitemaps


Submitting a sitemap will help Google determine what pages you have on your website so they can index them. If you don’t submit a sitemap they may not index all of the pages on your website, which means you won’t get as much traffic.

Sitemaps have to be submitted in XML format, and they can’t contain more than 50,000 URLs or be larger than 10 MB. If you exceed either of those limits, you need to split your sitemap into multiple files and submit them separately.

If you aren’t technical, you can go to XML Sitemaps to create a sitemap. All you have to do is enter the URL of your homepage and click “start”.

Once your sitemaps have been uploaded, Google will tell you how many of your URLs are being indexed. Don’t worry; it is common for Google not to index all of your web pages. But your goal should still be to get as many pages indexed as possible.

Typically, pages aren’t indexed because their content isn’t unique, their title tags and meta descriptions are generic, or not enough websites link to them.
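If you’d rather script your sitemap than use a generator, the format is simple enough to produce in a few lines. Here is a minimal sketch in Python (standard library only) that writes sitemap files and splits the URL list so no file exceeds the 50,000-URL limit; the example URLs are placeholders, not real pages.

```python
# Minimal sketch: write XML sitemaps with Python's standard library,
# splitting the URL list so no single file exceeds the 50,000-URL limit.
# The example URLs below are placeholders.
import xml.etree.ElementTree as ET

MAX_URLS = 50_000
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemaps(urls, prefix="sitemap"):
    for i in range(0, len(urls), MAX_URLS):
        urlset = ET.Element("urlset", xmlns=NS)
        for loc in urls[i:i + MAX_URLS]:
            entry = ET.SubElement(urlset, "url")
            ET.SubElement(entry, "loc").text = loc
        ET.ElementTree(urlset).write(
            f"{prefix}-{i // MAX_URLS + 1}.xml",
            encoding="utf-8", xml_declaration=True)

write_sitemaps(["https://www.example.com/", "https://www.example.com/about"])
```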

Crawler access

google webmaster tools crawler access

There will be some pages on your website that you just don’t want Google to index, such as private login areas, RSS feeds, or sensitive data that you don’t want people accessing.

By creating a robots.txt file you can block not just Google, but all search engines from accessing web pages that you don’t want them to get their hands on. However, for highly sensitive areas of your website you may want to consider password protecting all relevant directories.

Through the robots.txt generator and tester, you can not only create a robots.txt file but also check that it is correct before you upload it to your server. That’s wise to do, because the last thing you want is to make a mistake and tell search engines not to index your whole website.

And if you mess up and find Google indexing pages that you don’t want indexed, you can request their removal from this section.
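If you want a second opinion on your robots.txt before uploading it, you can sanity-check it in code, too. Below is a minimal sketch using Python’s standard-library robots.txt parser; the domain and paths are placeholders.

```python
# Minimal sketch: check which paths a robots.txt blocks for a given
# crawler, using only the standard library. Domain and paths are
# placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live file

for path in ["/", "/private/login", "/feed.xml"]:
    url = f"https://www.example.com{path}"
    ok = rp.can_fetch("Googlebot", url)
    print(path, "allowed" if ok else "blocked")
```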


Sitelinks

google webmaster tools sitelinks

Sitelinks are links to a site’s interior pages displayed on a Google search results page. Not all sites have sitelinks, but as you grow in popularity you’ll naturally get them. Google generates these links automatically, but you can remove sitelinks you don’t want.

Through this section, you can somewhat control which sitelinks show up when someone searches for your website. You can’t fully control them because you can only block pages you don’t want to appear; you can’t pick which pages do appear.


Change of address

google webmaster tools change of address

If you are looking to change the URL of your website, you had better let Google know, or your traffic is going to decrease.

You can tell them in 4 easy steps:

  1. Set up the new site – Get the new domain up and running, and make sure all your content is available for the public to see.
  2. Redirect the old traffic – A 301 permanent redirect tells users and search engines that your site has permanently moved. (A quick way to verify the redirect is sketched after this list.)
  3. Add your new site to GWT – Make sure you also verify your new website.
  4. Tell GWT your new domain – In the change of address section, select the new domain name of your website.
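For step 2, it’s worth confirming that the old URLs really return a permanent (301) redirect rather than a temporary (302) one. Here is a minimal sketch using the third-party `requests` package; the domains are placeholders.

```python
# Minimal sketch: confirm the old domain issues a permanent redirect.
# Uses the third-party `requests` package; domains are placeholders.
import requests

resp = requests.get("http://old-domain.example/", allow_redirects=False)
print(resp.status_code)              # expect 301, not 302
print(resp.headers.get("Location"))  # expect the matching page on the new domain
```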

Settings

google webmaster tools settings

If your target audience is in a specific country, you can select that option in GWT. For example, if my target customer for KISSmetrics lives in the United States, I would tell GWT that my target audience lives in the United States.

In addition to that, you can select a preferred domain name: either http://yourdomain.com or http://www.yourdomain.com. Either one works; you just have to select which variation you prefer. The reason for picking one is that people may link to both versions of your domain, and by selecting one, you tell Google to combine the links, which will help your rankings.

The last setting to pay attention to is the crawl rate. If you feel that the Google bot should crawl your website more often and faster, you can tell it to do so. Or you can just let Google pick the crawl setting for your website. (This is typically the best option, because crawling too often can send too much bot traffic to your server and increase your hosting costs.)

Your website on the web

Have you ever wondered how Google looks at your website? After all, it’s a search engine and not a human… so naturally it won’t be able to look at a website in the same way you do.

But luckily for you, through GWT you can see how Google views your website.


Search queries

google webmaster tools search queries

Not only is it important to go after keywords that have high search volume, but it’s also important to make sure you have a good click-through rate.

By monitoring the search queries page, you can work on improving your click-through rate so that people are more likely to click on your listing when they search. Typically you can do this by making your title tag and meta description more attractive as that is what people read before clicking through to your site.

Links to your website


google webmaster tools links to your site

The best way to increase your rankings on Google is to get more sites to link to you. Usually, this happens naturally if your website is providing valuable information to potential customers.

A good way to monitor your link growth is to continually monitor this area in GWT. In addition to that, make sure you monitor which pages people are linking to.

If your links aren’t growing fast enough consider writing relevant linkbait that could be submitted throughout the social web. Getting on the homepage of Digg.com can drive thousands of new links to your site.


Keywords

google webmaster tools keywords

You may have a good idea of what keywords you want to rank for, but that may not be consistent with what Google is ranking you for. Under the keywords section, you can see what keywords your website is the most related to.

You can also see which variations of each keyword are relevant to your website. For example, some people misspell keywords, and you can find out which misspellings your website is most relevant for.

And if those aren’t the keywords you want to rank for, you can use that data to adjust the content on your website.


Internal links

Google Webmaster Tools Internal Links

Linking your web pages together is a great way to get more Google love. For example, if you want your about page to rank for your company name make sure you link to it multiple times.

If you don’t link to your internal pages, they will not get as much PageRank and they won’t place as well in the search listings.

In addition to that, this data will help you determine which pages Google considers the most important. For example, if you look at the image above, you’ll see that the website owner treated their about page as one of the most important pages on the site, so naturally Google felt that way as well.


Subscriber stats

google webmaster tools subscriber stats

If you have a blog, this area of GWT will be useful for you. If you don’t, it won’t.

Under the subscriber stats section, you can see which of your blog posts are the most subscribed to by Google’s feed reader. You can then take that data and write more posts that are similar to your popular ones. And of course, you can stop writing blog posts similar to the ones that no one subscribed to, as readers probably didn’t enjoy them as much.

On a side note, if you want to track your RSS growth, you can also check out Feedburner, which will allow you to track how popular your feed is.


Diagnostics

Websites are made by humans, so don’t expect them to be perfect. Your code may be a bit messed up, and even worse, your website may contain malware.

Through the diagnostics section, you can figure out what’s wrong with your site and how you can fix it.


Malware

google webmaster tools malware

If you have malware on your server, you should see a message here. If you don’t, GWT won’t show you much.

It’s important to keep malware off your server because Google tries not to rank infected sites highly; anyone who visits an infected site risks infecting their own computer. If you do happen to have malware, make sure you clean it up.

Crawl errors

google webmaster tools crawl errors

The crawl errors section will show you whether there are any problems with your site on the web or on mobile. The most common error you’ll see is a 404, which means Google’s bot can’t find a page.

The most common reason you’ll see 404 errors is that other websites link to pages that no longer exist, or never existed, on your website.

What you need to do is get a list of all the websites linking to dead pages on your site and contact them. When emailing them, ask if they can point the link at a valid page.

Or if you see a lot of people linking to a dead page on your site, you can always 301 redirect that old URL to the new URL.
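If you want to work through dead pages systematically, a short script can tell you which URLs still return 404. Here is a hedged sketch using the third-party `requests` package; the URL list is a placeholder for whatever you export from the crawl errors report.

```python
# Minimal sketch: find URLs that return 404 so you can pick 301 targets
# for them. The URL list is a placeholder; in practice, export it from
# the crawl errors report. Uses the third-party `requests` package.
import requests

candidate_urls = [
    "https://www.example.com/old-page",
    "https://www.example.com/blog/moved-post",
]

for url in candidate_urls:
    status = requests.head(url, allow_redirects=True).status_code
    if status == 404:
        print("dead:", url)  # choose a 301 redirect target for this URL
```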


Crawl stats

google webmaster tools crawl stats

If you have thousands of pages on your site, then you should expect Google to crawl most of these pages on a daily or weekly basis. If they aren’t, then something is wrong.

Through the graphs and data tables that GWT provides, you should be able to get a good sense if they are crawling enough pages on your website. If they aren’t, consider adjusting the crawl rate under the settings tab.

HTML suggestions

google webmaster tools HTML suggestions

When Googlebot crawls your site, it may find some issues with your content. These issues won’t prevent your site from appearing in Google search results, but addressing them may help boost your traffic.

The most common problem is related to title tags and meta descriptions. If every page on your site has unique and detailed title tags and meta descriptions, you should be fine. At the same time, you also have to make sure your title tags aren’t too short or too long.

And if that isn’t the case, you can go through the URLs that GWT flags and fix the issues.

Labs

GWT regularly tests out new features. The easiest way to find out about them is to look through the Labs section.

Fetch as Googlebot

google webmaster tools fetch as Googlebot

With Fetch as Googlebot, you can see exactly how a page appears to Google. All you have to do is type in a URL, and GWT will tell you whether Googlebot could successfully fetch it.

There currently isn’t a ton of data that GWT is showing in this area, but I expect this to change in the future.


Sidewiki

google webmaster tools Sidewiki

If you’re a webmaster, you can leave a special Sidewiki entry on pages of your site. You can choose to leave a master entry for the whole site, or create page specific entries to engage your visitors.

All you have to do is:

  1. Validate your account in GWT; once you have, you’ll see an option to write as the page owner.
  2. Select the “Write as the page owner” checkbox in the entry form. If you’d like to leave a master entry across the whole site, also select the “Show this page owner entry on all pages…” checkbox.
  3. Click Publish.

Site performance

google webmaster tools site performance

Your website’s load time is one of the most important things to monitor. Keep working to improve this number every month, because if your website is too slow, your Google traffic may drop.

Numerous website owners have seen their traffic increase after improving their website’s load time.

If you aren’t sure how fast your website should load, don’t worry. Google will tell you if your website is too slow or quick enough.
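You don’t have to wait for GWT to flag slowness; you can spot-check response times yourself. A minimal sketch with the third-party `requests` package; note this measures the HTTP response, not full page render time, and the URL is a placeholder.

```python
# Minimal sketch: spot-check how long a page takes to respond. `elapsed`
# covers the HTTP request/response, not full browser render time.
import requests

resp = requests.get("https://www.example.com/")
print(f"{resp.elapsed.total_seconds():.2f}s for {resp.url}")
```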


Video sitemaps

google webmaster tools video sitemaps

If you have videos on your site, make sure you include the raw video files in your sitemap. This way Google can index them, as it may not be able to find them otherwise.

This will help ensure that your videos are getting the traffic they deserve from Google video search.

If you are trying to create a video sitemap, this page should explain how to do so.


Conclusion

GWT is a useful, free tool. If you aren’t making use of it, you should start now. It’s worth using because it will help guide you and tell you what to do to improve your Google traffic.

Monday, November 27, 2017

Google PageRank Technology Explanation

The following is an extract from Google’s Technology web page at
http://www.google.com/technology/


how to improve Google Page Rank

PageRank Introduction

Google runs on a unique combination of advanced hardware and software. The speed you experience can be attributed in part to the efficiency of our search algorithm and partly to the thousands of low-cost PCs we’ve networked together to create a superfast search engine.

The heart of our software is PageRank™, a system for ranking web pages developed by our founders Larry Page and Sergey Brin at Stanford University. And while we have dozens of engineers working to improve every aspect of Google on a daily basis, PageRank continues to provide the basis for all of our web search tools.

PageRank Explained

PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important."

Important, high-quality sites receive a higher PageRank, which Google remembers each time it conducts a search. Of course, important pages mean nothing to you if they don't match your query. So, Google combines PageRank with sophisticated text-matching techniques to find pages that are both important and relevant to your search. Google goes far beyond the number of times a term appears on a page and examines all aspects of the page's content (and the content of the pages linking to it) to determine if it's a good match for your query.
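The “votes weighted by the voter’s own importance” idea can be captured in a few lines of code. The following is a toy Python sketch of the classic power-iteration formulation on a three-page web, not Google’s production algorithm; the 0.85 damping factor is the value commonly cited from the original PageRank paper.

```python
# Toy sketch of the PageRank idea: each page splits its score among the
# pages it links to, damped toward a uniform baseline. Not Google's
# production algorithm.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # page "votes" for target with a share of its own rank
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(toy_web))  # C ranks highest: it collects votes from A and B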

PageRank Integrity

Google's complex, automated methods make human tampering with our results extremely difficult. And though we do run relevant ads above and next to our results, Google does not sell placement within the results themselves (i.e., no one can buy a higher PageRank). A Google search is an easy, honest and objective way to find high-quality websites with information relevant to your search.

Saturday, November 25, 2017

Latest Google SEO Updates & Algorithms

The digital world is more dynamic, influential, and competitive than ever before. To achieve high search engine rankings and maintain them, you have to follow the latest Google SEO updates. This is the first step toward keeping up with SEO trends and staying competitive.

SEO changes track Google’s algorithm updates directly. Since Google leads the search market, changes in its 2017 algorithm updates are vital to optimizing your website. Website admins need a solid understanding of the latest Google algorithm updates and related procedures; only then can they make the SEO changes needed to optimize their sites, improve domain authority, and earn high rankings in SERPs.

Broadly, Google is focused on improving its web search services for users, and by tracking changes to Google’s algorithm, marketers can improve the rankings of their sites. Google has a long history of well-known algorithm updates that shape how SERPs are ranked.


To find the latest Google SEO updates, marketers need to watch for changes to the following algorithms:


10 Google SEO Updates & Algorithm Changes in 2017

1.) Google Hummingbird Update





Introduced in August 2013, the Google Hummingbird Update was a major overhaul of Google’s search algorithm that plays a significant role in deciding the ranking of websites. It draws on 200+ factors that can affect search results and website ranking. One of the biggest changes in Hummingbird was a sharper focus on mobile, which is not surprising given the explosion of smartphones in recent years. The name “Hummingbird” comes from its ability to be “precise and fast”, and it is mainly designed to focus on the meaning of a whole phrase or query rather than on individual keywords. Hummingbird looks at the entire phrase to decipher its meaning, and pages matching that meaning do better in search results.


SEO changes related to Hummingbird


  • Application of meaning technology to billions of pages from across the web
  • Use of Knowledge Graph facts to ensure better search results
  • Easy recognition of Keyword stuffing
  • Effectiveness of Long-tail keywords

2.)  Google Penguin Update


Google launched the Penguin Update in April 2012 to catch websites spamming Google’s search results. The update is mainly aimed at decreasing the rankings of websites that violate Google’s Webmaster Guidelines and use black-hat SEO techniques to artificially inflate their rankings, for example by obtaining or buying links through improper practices. The primary reason behind this update was to penalize websites that use manipulative techniques to achieve high rankings. Per Google’s estimates, Penguin affects approximately 3.1% of search queries in English, around 3% of queries in languages like German, Arabic, and Chinese, and an even bigger percentage in “highly spammed” language categories. Before Penguin, sites commonly used negative external link building tactics to rank well in SERPs and boost their traffic. Once Penguin was introduced, content became vital: sites with great content were recognized, and sites with thin or spammy content were penalized.


Some confirmed Google Penguin SEO updates are


  • Penguin 1 – on April 24, 2012 (impacting around 3.1% of queries)
  • Penguin 2 – on May 26, 2012 (impacting less than 0.1%)
  • Penguin 3 – on October 5, 2012 (impacting around 0.3% of queries)
  • Penguin 4 (a.k.a. Penguin 2.0)- on May 22, 2013 (impacting 2.3% of queries)
  • Penguin 5 (a.k.a. Penguin 2.1)- on October 4, 2013 (impacting around 1% of queries)
  • Penguin 6 (a.k.a. Penguin 3.0)- on October 17, 2014 (impacting less than 1% of English queries). On December 1, 2014, Google confirmed that the update was still rolling out, with webmasters continuing to report significant fluctuations.
  • Penguin 7 (a.k.a. Penguin 4.0)- on September 23, 2016

3.) Google Panda Update


Google's Panda

Google’s Panda Update was introduced in February 2011 as a search filter meant to stop sites with low-quality content from making their way into Google’s top search results. Panda is updated from time to time; when that happens, sites previously hit may escape if they have made the right improvements, and sites that escaped before may get caught. Panda affects the ranking of an entire site or a specific section, rather than individual pages.

Some important Google SEO updates according to Google Panda Update are


  • No Multiple Pages with the Same Keyword
  • Get Rid of Auto-generated Content and Roundup/Comparison Type of Pages
  • No Pages with 1-2 Paragraphs of Text Only
  • No Scraped Content
  • Panda Likes New Content
  • Be Careful with Affiliate Links and Ads
  • Too Many Outbound Links with Keywords are bad

4.) Google Pigeon Update

Google Pigeon Update


Launched on July 24, 2014, for U.S. English results, the Google Pigeon Update is an algorithm update introduced to provide more useful, relevant, and accurate local search results that are tied more closely to traditional web ranking factors. Google said the new algorithm improves its distance and location ranking parameters. The changes also affect search results shown in Google Maps, since the update lets Google rank results based on the user’s location and the listings at hand in the local directory. The main purpose of the Pigeon Update is to give preference to local results in SERPs, which makes it extremely beneficial for local businesses.

Latest updates in SEO based on Google Pigeon Updates are

  • Location Matters More Than Ever
  • Don’t Over-Optimize Your Website
  • Strong Domains Matter more

5.) Google Mobile-Friendly Update

On April 21, 2015, Google introduced its mobile-friendly search algorithm, intended to boost mobile-friendly pages in Google’s mobile search results. The change was significant enough that the date it rolled out earned a variety of nicknames, such as Mobilegeddon and Mobilepocalypse. One of the best ways to prepare is to verify that Google considers your pages mobile-friendly by using its Mobile-Friendly Test tool.

Latest Google Mobile-Friendly SEO updates are

  • Google mobile-friendly testing tool now has API access
  • Google may pick desktop over AMP page for the mobile-first index
  • Google begins mobile-first indexing, using mobile content for all search rankings
  • Google will show AMP URLs before App deep link URLs in mobile results
  • Google says page speed ranking factor to use mobile page speed for mobile sites

6.) Google: Payday Update

Launched on June 11, 2013, the Google Payday Update was a new search algorithm aimed at cleaning up results for “spammy queries” such as payday loans, pornography, and other heavily spammed searches. It can be understood as a set of algorithm updates initiated to identify and penalize websites that use search engine spam techniques (also known as black-hat SEO or spamdexing) to improve their rankings for these queries. Let’s have a look at some recent Payday updates:


Recent Google Payday updates are

  • Google Payday Loan 1.0
  • Google Payday Loan 2.0
  • Google Payday Loan 3.0

7.) Google: Pirate Update

Google’s Pirate Update

Introduced in August 2012, Google’s Pirate Update is a filter that demotes sites that have received many copyright infringement reports, as filed through Google’s DMCA system. It is updated periodically; when updates happen, previously affected websites may recover if they have made the right changes, new offenders may be caught, and “false positives” may be released.

Some of the Google Pirate SEO latest updates-

  • The Pirate Update Penalized Websites That Received A High Volume Of Copyright Violation Reports
  • The Pirate Update Is A Win For Media And Artists
  • Getting A Page Removed From The Index Requires Valid Documentation

8.) Google: EMD Update

Launched in September 2012, the EMD (Exact Match Domain) Update is a filter Google uses to prevent low-quality sites from ranking well simply because their domain names match search terms. When a fresh EMD update rolls out, sites that have improved their content may regain good rankings; new sites with poor content, or sites previously missed, may get caught; and “false positives” may be released.


9.) Google: Top Heavy Update

The Google Top Heavy Update was launched in January 2012 to keep sites that are “top heavy” with advertisements from ranking well in Google search listings. Top Heavy is refreshed periodically; when an update occurs, websites that have removed excessive advertisements may regain their lost rankings, while sites newly considered “top heavy” may get caught.


Some of the Google Top Heavy SEO Updates

  • Google Updates Its Page Layout Algorithm To Go After Sites “Top Heavy” With Ads
  • Have The Same Ad-To-Organic Ratio As Google Search? Then You Might Be Safe From The Top Heavy Penalty
  • The Top Heavy Update: Pages With Too Many Ads Above The Fold Now Penalized By Google’s “Page Layout” Algorithm

 10.) Google Page Rank Update

Google Page Rank

If you do SEO or are involved in search marketing, you will eventually come across Google PageRank. PageRank is Google’s system for tallying link votes and determining which pages are most important based on them. Those scores are then used, alongside many other factors, to determine whether a page will rank well in a search. However, some experts consider Toolbar PageRank an outdated, deprecated metric and advise marketers not to waste time on it. Google released its last Toolbar PageRank update on December 5/6, 2013, and thereafter declared: “PageRank is something that we haven’t updated for over a year now, and we’re probably not going to be updating it again going forward, at least the Toolbar version.”

Some of the Toolbar Page Rank updates are-


  • Toolbar Page Rank Updates released on 5/6 December 2013 (LAST PAGERANK UPDATE EVER)
  • Toolbar Page Rank Updates released on 4 February 2013
  • Toolbar Page Rank Updates released on 7 November 2012
  • Toolbar Page Rank Updates released on 2 August 2012
  • Toolbar Page Rank Updates released on 2 May 2012
  • Toolbar Page Rank Updates released on 7 February 2012
  • Toolbar Page Rank Updates released on 7 November 2011
  • Toolbar Page Rank Updates released in the first week of August 2011
  • Toolbar Page Rank Updates released in July 2011
Once you are aware of all the Google search algorithm updates, the next step is to stay in constant touch with top SEO resources so you know all the latest Google SEO updates.


Thursday, November 23, 2017

Search Engines and Algorithms

In this article, we are going to look at search engine algorithms, how diverse they are, what they have in common, why it’s important to know their differences, and how to make this information work for you in SEO. There is something for everyone, from the novice to the expert. Over the course of this series, we will look at optimizing your site for specific search engines, as well. The top six major players we will look at in this series are AOL Search, Google, and AskJeeves in the first article; Yahoo! and AltaVista in part 2; MSN in part 3; and in the last article, part 4, we’ll look at MetaSearch Engines.

Just about everyone knows what a search engine is.  Whenever you have a question, want to look up the address of your favorite restaurants or need to make a qualified online purchase, chances are, you visit a search engine on the Internet.

If you’ve ever used two different search engines to conduct the same search query, then you will have noticed that the results weren’t the same.  So why will the same query on different search engines produce different results? Part of the answer is because not all search engine indexes are going to be exactly the same, as it depends on what the spiders find or what information humans have submitted to the database. But more importantly, not every search engine uses the same algorithm to search through their databases.  An algorithm is what the search engines use to determine the relevance of the information in the database to what the user is searching for.

What is a Search Engine Algorithm?


A search algorithm is a procedure that takes a problem as input and returns a solution, usually after evaluating a number of possible solutions.  A search engine algorithm uses keywords as the input problem and returns relevant search results as the solution, matching those keywords to the results stored in its database.  The keyword relevancy of each page is determined by search engine spiders that analyze web page content using formulas that vary from one search engine to the next.

Types of Information that Factor into Algorithms


Some services collect information on the queries individual users submit to search services, the pages they look at subsequently, and the time spent on each page. This information is used to return results pages that most users visit after initiating the query. For this technique to succeed, large amounts of data need to be collected for each query. Unfortunately, the potential set of queries to which this technique applies is small, and this method is open to spamming.

Another approach involves analyzing the links between pages on the web on the assumption that pages on the topic link to each other, and authoritative pages tend to point to other authoritative pages.  By analyzing how pages link to each other, an engine can both determine what a page is about, and whether that page is considered relevant.  Similarly, some search engine algorithms figure internal link navigation into the picture.  Search engine spiders follow internal links to weigh how each page relates to another, and considers the ease of navigation.  If a spider runs into a dead-end page with no way out, this can be weighed into the algorithms as a penalty.

Original search engine databases were made up of all human classified data.  This is a fairly archaic approach, but there are still many directories that make up search engine databases, like the Open Directory (also known as DMOZ), that are entirely classified by people.  Some search engine data are still managed by humans, but after the algorithmic spiders have collected the information.

One of the elements that a search engine algorithm scans for is the frequency and location of keywords on a web page. Those with higher frequency are typically considered more relevant.  This is referred to as keyword density.  It’s also figured into some search engine algorithms where the keywords are located on a page.
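Keyword density is easy to compute yourself. Here is a minimal Python sketch; the page text is a made-up placeholder.

```python
# Minimal sketch: keyword density = occurrences of a term divided by the
# total word count of the page, as a percentage.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

page = "Tesla built coils. Tesla also pioneered alternating current."
print(round(keyword_density(page, "tesla"), 1))  # 25.0 (2 of 8 words)
```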

Like keywords and usage information, meta tag information has been abused.  Many search engines do not factor in meta tags any longer, due to web spam.  But some still do, and most look at Title and Descriptions.  There are many other factors that search engine algorithms figure into the calculation of relevant results.  Some utilize information like how long the website has been on the Internet, and still others may weigh structural issues, errors encountered, and more.

Why are Search Engines so different?


Search engine algorithms are highly secret, competitive things.  No one knows exactly what each search engine weighs and what importance it attaches to each factor in the formula, which leads to a lot of assumption, speculation, and guesswork.  Each search engine employs its own filters to remove spam, and each even has its own guidelines for determining what web spam is!

Search engines generally implement two or three major updates every year.  One simply has to follow the patent filings to know this.  Even if you are not interested in the patent itself, they may give you a heads up to possible changes that will be following in a search engine algorithm.

Another reason that search engines are so diverse is the widespread use of technology filters to sort out web spam.  Some search engines change their algorithms to include certain filters, while others don’t change the basic algorithms but implement filters on top of the basic calculations.  In programming terms, filters are essentially “higher-order functions that take a predicate and a list and return those elements of the list for which the predicate is true.”  A simpler way to think of a search engine filter is as a water purifier: water passes through a device made of porous material that removes unwanted impurities.  A search engine filter likewise seeks to remove unwanted “impurities” from its results.


How to achieve good SEO among all the mainstream search engines

It may seem like a daunting task to please all of the many search engines out there, as there are thousands. Still, there are several pieces of advice I would give in order to streamline your SEO efforts among all of the major search engines.

Do your keyword research.  This means learning how many times your keywords are being searched for every day, what competition you have, and how these relate to your content on each page.

Select 3 – 5 phrases to optimize each page, instead of a whole slew of keywords.  The more keywords you try to use, the more diluted your keyword density becomes.  Use keywords for each page, not geared toward the entire site.  (Keyword density ranges for all of the above websites run from .07% for Google to 1.7% for Yahoo.)

Write unique, compelling titles for each page.  Titles are still important to all five top search engines.

Focus on writing unique content that adds value to users and incorporates valuable keywords.  Write for ideas, not keywords.  When you are finished with your ideas, your keywords should result from the content, and not the content from the keywords.

Ensure site architecture and design do not prohibit thorough search engine crawling.  Clear navigation, readable code, no broken links, and validated markup will allow you to not only make it easier for the search engine spiders to crawl your website, but this will also mean better page stickiness to your visitors.

Build high-quality, relevant, inbound links.  Since all search engines rely upon inbound links to rank the relevancy of a site, it is good to concentrate on this area.  Don’t inflate your backlinks with artificial links.  Build organic, natural links, and keep the sites you link to relevant.  Avoid link directories on your website.  In the case of all search engines, the more links, the better.

Be familiar with how the top search engines work. When you do your homework and understand the workings of the search engines, it will help you determine what search engines look for in a website that it considers relevant, and what practices to stay away from.

Stay with it. SEO is not a one-time project. Continual growth in content, links, and pages is required for long-term success.  The key to keeping your site relevant to search engines is fresh and unique content.  This applies to keywords, links and content.

Don’t sweat the small stuff.  Changes in search engine algorithms should not affect you too much if you follow the basic principles of SEO.  While tweaking may be necessary, there is certainly no cause for alarm.

So why is understanding the search engines so important?  As my son would say, “Mom, you gotta know your enemy.”  Well, not necessarily the enemy in this case, but what he says has a grain of truth in it when it comes to search engines.  You have to be familiar with the search engines if you ever hope to optimize for them.  The more you familiarize yourself with the way they work and treat websites in relevancy of search results, the better your chances will be for ranking in those search engines.

If you optimize only for Google or Yahoo, however, you’ll be missing out on two-thirds of all of the search engine traffic, and only optimizing for MSN means you’ll lose about 85% of potential search traffic.  Optimizing for several search engines is going to be more easily done if you understand the basic concepts of each search engine.

Does this mean that any time a search engine changes you have to go running to change your website?  No.  In fact, one of the most reasonable pieces of advice I can give you when a search engine algorithm changes is not to panic.  However, when a major update is implemented, being familiar with the search engines is your best possible defense; that way, the new filters don’t force upon you a huge learning curve.

So while trying to figure out the exact formulas for each search engine’s algorithm will be a never-ending, insurmountable source of frustration for any SEO, focusing instead on ways to make all of them happy at once seems like a better use of your time, if not just as frustrating.  But if you follow the basic concepts of SEO, then you’ll not only be found in the search engine results but be a more relaxed individual in the process.

Differences in the Mainstream Search Engines


Knowing your “enemy” will go a long way in helping you understand why your website may be performing the way it is in a particular search engine.  Search engine algorithms change constantly, some daily.  There is no way to know exactly when or how a search engine will change, but there are trends to follow.  All of the engines weigh the same major factors in their relevancy scores, but each weights them differently.

The breakdown of the market share of search queries that these six search engines currently control is listed below.  These figures were taken from Forbes.com for August 2005:

  • Google: 37.3%, up from 36.5% in July
  • Yahoo: 29.7%, down from 30.3% in July
  • MSN: 15.8%, up from 15.5% in July
  • AOL: 9.6%, down from 9.9% in July
  • AskJeeves: 3%
  • AltaVista: 1.6%

AOL Search


AOL claims to be the first major search engine featuring “clustering” technology: search results are automatically clustered into relevant topics and displayed alongside the list of general results, using technology licensed from Vivisimo.  However, about 80% of searches done on AOL now use Google’s databases.  For all intents and purposes, optimizing for AOL Search is currently very similar to optimizing for Google.  Of the top ten results for “pets” in both AOL Search and Google, there is no difference in results, and for the search term “pet feeding supplies,” there is only one variance.  Similar results were achieved for other randomly chosen keywords and phrases.  AOL Search queries make up approximately 16% of searches on the web.

Google


Google uses what is commonly known as the Hilltop Algorithm, or the “Austin Update.”  Hilltop emphasizes the voting power of what it considers “authority sites.” These are websites or pages that Google assesses to be of strong importance on a particular keyword topic.

To improve the quality of search results, and especially to make search engines resistant to automatically generated web pages built around content-specific ranking criteria (doorway pages), the concept of link popularity was developed.  Link popularity weighs very heavily in Google’s PageRank algorithm.

According to WikiPedia.org, “PageRank is a family of algorithms for assigning numerical weightings to hyperlinked documents (or web pages) indexed by a search engine.”

There are over 100 factors calculated into Google’s PageRank algorithm.  What exact weight is given to each factor is unknown, but we do know that backlinks are probably among the most heavily weighted factors in determining relevancy.  Other factors might be keyword density, date of domain registration or age of the website, clean design, error-free pages, text navigation, the absence of spam, and more.  We aren’t sure whether these things are factored into the algorithm itself or are just used in filters employed by Google.

Google is probably the most mysterious of all of the top search engines and is currently the most popular search engine.  Google is estimated to have about 35% of the searches made.  While currently only ranking at #3, we could easily expect this to change in the near future.


AskJeeves

AskJeeves is a directory built entirely by human editors, with results presented as a set of related questions for which “answers,” or links, exist.  Enter your question in plain English into the text box, then click on the “Ask” button and Jeeves does the rest.  Now, the search engine’s being rebranded, and its mascot, the butler, is soon to be history.

This search engine is powered by Teoma.  A few observations by fellow SEOs: Teoma holds orphans in its index longer than any other search engine, so if you use redirects, whether temporary (302) or permanent (301), there is a high chance that your old URLs will stay in the index for a long time.  The search engine’s re-index schedule in the past used to be between three and six months, with occasionally a couple of crawls in one month, but the level of re-indexing has always been fairly sporadic and mainly shallow, and it seems that sponsored listings receive far more attention than any site listed organically in the index.  There is no way to submit your website to Teoma’s or AskJeeves’ index unless you pay for sponsored listings or just wait for the robot to crawl your site, which can take a long time.

So for a pay-per-click platform, AskJeeves seems to be a good place to advertise.  AskJeeves displays sponsored listings from Google’s AdWords program.  If you have pretty good qualified traffic from your AdWords or you have an Amazon.com store, then you’ll have better luck with AskJeeves.  The key word here is “qualified.”  There seems to be a higher amount of click fraud, by nature, with AskJeeves.  I believe the primary reason for this is the way that AskJeeves SERPs are shown.  In Google, you have sponsored listings that are distinctly away from the organic listings: either highlighted in blue at the top or down the right side.  With AskJeeves, your organic results look just like pay-for-placement listings.



In the next article, we will look at AltaVista’s and Yahoo’s search engines in depth, showing you what they specifically look for in a web page to deem it relevant to their search results.  Stay tuned!


Wednesday, November 22, 2017

Google Search Engine Architecture

Introduction

A search engine is built on a huge database of internet resources such as web pages, newsgroups, programs, and images. It helps locate information on the World Wide Web.

Users can search for information by entering a query in the form of keywords or a phrase. The search engine then looks for relevant information in its database and returns it to the user.


Search Engine Components

Generally, a search engine has three basic components, as listed below:


  1. Web Crawler
  2. Database
  3. Search Interfaces


Web crawler

Also known as a spider or bot, the crawler is a software component that traverses the web to gather information.


Database

The information gathered from the web is stored in a database, which consists of huge web resources.


Search Interfaces

This component is an interface between the user and the database. It helps the user search through the database.

Search Engine Working


The web crawler, database, and search interface are the major components that actually make a search engine work. Search engines make use of Boolean expressions (AND, OR, NOT) to restrict and widen the results of a search. The search engine performs the following steps:

  • The search engine looks for the keyword in its index of a predefined database instead of going directly to the web to search for it.
  • It uses software to search for the information in the database. This software component is known as a web crawler.
  • Once the web crawler finds the pages, the search engine shows the relevant web pages as results. These retrieved pages generally include the title of the page, the size of the text portion, the first several sentences, and so on. The search criteria may vary from one search engine to another, and the retrieved information is ranked according to various factors such as frequency of keywords, relevancy of information, and links.
  • The user can click on any of the search results to open it.

Architecture

The search engine architecture comprises the three basic layers listed below:
  • Content collection and refinement.
  • Search core
  • User and application interfaces



Search Engine Processing

Indexing Process


The indexing process comprises the following three tasks:

  • Text acquisition

  • Text transformation

  • Index creation

TEXT ACQUISITION
It identifies and stores documents for indexing.

TEXT TRANSFORMATION
It transforms document into index terms or features.

INDEX CREATION
It takes the index terms created by text transformation and creates data structures to support fast searching.
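Here is a toy Python sketch of how these three tasks fit together: two hard-coded documents stand in for text acquisition, lowercasing and tokenizing is the text transformation, and the inverted index is the data structure that makes lookups fast.

```python
# Toy sketch of the indexing pipeline: acquire text, transform it into
# index terms, and build an inverted index (term -> document IDs).
import re
from collections import defaultdict

docs = {  # stand-in for text acquisition
    1: "Search engines index the Web",
    2: "A web crawler gathers pages for the index",
}

index = defaultdict(set)
for doc_id, text in docs.items():
    # text transformation: lowercase and split into terms
    for term in re.findall(r"[a-z0-9]+", text.lower()):
        index[term].add(doc_id)

print(sorted(index["index"]))    # [1, 2] -- found without scanning pages
print(sorted(index["crawler"]))  # [2]
```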

Examples

Following are several of the search engines available today: Google, Bing, Yahoo!, Baidu, and DuckDuckGo, among others.




Tuesday, November 21, 2017

Google Search Operators

What are Google search operators?

Google search operators are special characters and commands (sometimes called “advanced operators”) that extend the capabilities of regular text searches. Search operators can be useful for everything from content research to technical SEO audits.

How do I use search operators?

You can enter search operators directly into the Google search box, just as you would a text search:


Except in special cases (such as the “in” operator), Google will return standard organic results.

Google search operators cheat sheet

You can find all of the major organic search operators below, broken up into three categories: “Basic”, “Advanced”, and “Unreliable”. Basic search operators are operators that modify standard text searches.

I. Basic Search Operators


" " :"nikola tesla"

Put any phrase in quotes to force Google to use exact-match. On single words, prevents synonyms.

OR: tesla OR edison

Google search defaults to logical AND between terms. Specify "OR" for a logical OR (ALL-CAPS).

|: tesla | edison

The pipe (|) operator is identical to "OR". Useful if your Caps-lock is broken :)

( ): (tesla OR edison) alternating current

Use parentheses to group operators and control the order in which they execute.

-: tesla -motors

Put minus (-) in front of any term (including operators) to exclude that term from the results.

*: tesla "rock * roll"

An asterisk (*) acts as a wild-card and will match on any word.

#..#: tesla announcement 2015..2017

Use (..) with numbers on either side to match on any integer in that range of numbers.

$: tesla deposit $1000

Search prices with the dollar sign ($). You can combine ($) and (.) for exact prices, like $19.99.

€: €9,99 lunch deals

Search prices with the Euro sign (€). Most other currency signs don't seem to be honored by Google.

in: 250 kph in mph

Use "in" to convert between two equivalent units. This returns a special, Knowledge Card style result.

Advanced search operators are special commands that modify searches and may require additional parameters (such as a domain name). Advanced operators are typically used to narrow searches and drill deeper into results.


II. Advanced Search Operators

intitle: intitle:"tesla vs edison"

Search only in the page's title for a word or phrase. Use exact-match (quotes) for phrases.

allintitle: allintitle: tesla vs edison

Search the page title for every individual term following "allintitle:". Same as multiple intitle:'s.

inurl: tesla announcements inurl:2016

Look for a word or phrase (in quotes) in the document URL. Can combine with other terms.

allinurl: allinurl: amazon field-keywords nikon

Search the URL for every individual term following "allinurl:". Same as multiple inurl:'s.

intext: intext:"orbi vs eero vs google wifi"

Search for a word or phrase (in quotes), but only in the body/document text.

allintext: allintext: orbi eero google wifi

Search the body text for every individual term following "allintext:". Same as multiple intext:'s.

filetype: "tesla announcements" filetype:pdf

Match only a specific file type. Some examples include PDF, DOC, XLS, PPT, and TXT.

related: related:nytimes.com

Return sites that are related to a target domain. Only works for larger domains.

AROUND(X): tesla AROUND(3) edison


Returns results where the two terms/phrases are within (X) words of each other.
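Since operators are just text in the query, they are easy to compose programmatically. Here is a minimal Python sketch that builds a Google search URL from an operator query; only the standard `q` parameter is assumed.

```python
# Minimal sketch: compose operator queries and URL-encode them into a
# Google search URL using the standard `q` parameter.
from urllib.parse import urlencode

def google_url(query):
    return "https://www.google.com/search?" + urlencode({"q": query})

# Exact-match title search, restricted to PDFs:
print(google_url('intitle:"tesla vs edison" filetype:pdf'))
```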


Monday, November 20, 2017

How Do Search Engines Work?

Every search engine has three main functions: crawling (to discover content), indexing (to track and store content), and retrieval (to fetch relevant content when users query the search engine).

Crawling

Crawling is where it all begins: the acquisition of data about a website.

This involves scanning sites and collecting details about each page: titles, images, keywords, other linked pages, etc. Different crawlers may also look for different details, like page layouts, where advertisements are placed, whether links are crammed in, etc.

But how is a website crawled? An automated bot (called a “spider”) visits page after page as quickly as possible, using page links to find where to go next. Even in the earliest days, Google’s spiders could read several hundred pages per second. Nowadays, it’s in the thousands.

web crawler diagram


When a web crawler visits a page, it collects every link on the page and adds them to its list of next pages to visit. It goes to the next page in its list, collects the links on that page, and repeats. Web crawlers also revisit past pages once in a while to see if any changes happened.

This means any site that’s linked from an indexed site will eventually be crawled. Some sites are crawled more frequently, and some are crawled to greater depths, but sometimes a crawler may give up if a site’s page hierarchy is too complex.

One way to understand how a web crawler works is to build one yourself. We’ve written a tutorial on creating a basic web crawler in PHP, so check that out if you have any programming experience.
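That tutorial is in PHP, but the loop it describes fits in a short Python sketch as well: fetch a page, collect its links, queue them, repeat. Standard library only; the start URL is a placeholder, and a real crawler would also respect robots.txt and rate limits.

```python
# Minimal sketch of the crawl loop described above: fetch a page, collect
# its links, queue them, repeat. Standard library only; the start URL is
# a placeholder, and a real crawler would also honor robots.txt.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except Exception:
            continue  # dead link or unreadable page; move on
        parser = LinkCollector()
        parser.feed(html)
        # Resolve relative links and add them to the queue
        queue.extend(urljoin(url, link) for link in parser.links)
        print("crawled:", url)

crawl("https://example.com/")
```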

google search on tablet


Note that pages can be marked “noindex,” which asks search engines to skip indexing them. Non-indexed parts of the internet are known as the “deep web”, and some sites, like those hosted on the TOR network, can’t be indexed by search engines at all.

Indexing

Indexing is when the data from a crawl is processed and placed in a database.


Imagine making a list of all the books you own: their publishers, their authors, their genres, their page counts, and so on. Crawling is when you comb through each book, and indexing is when you log each one in your list.

Now imagine it’s not just a room full of books but every library in the world. That’s a small-scale version of what Google does; it stores all of this data in vast data centers with thousands of petabytes’ worth of drives.

Retrieval and Ranking


Retrieval is when the search engine processes your search query and returns the most relevant pages that match your query.

Most search engines differentiate themselves through their retrieval methods: they use different criteria to pick and choose which pages fit best with what you want to find. That’s why search results vary between Google and Bing.

Ranking algorithms check your search query against billions of pages to determine each one’s relevance. Companies guard their ranking algorithms as industry secrets due to their complexity and value. A better algorithm translates to a better search experience.

They also don’t want web creators to game the system and unfairly climb to the tops of search results. If the internal methodology of a search engine ever got out, all kinds of people would surely exploit that knowledge to the detriment of searchers like you and me.

pen html search engine meta

Search engine exploitation is possible, of course, but isn’t so easy anymore.

Originally, search engines ranked sites by how often keywords appeared on a page, which led to “keyword stuffing” — filling pages with keyword-heavy nonsense.
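That early frequency-based ranking is simple enough to sketch, which also makes it obvious why stuffing worked. A toy Python example with made-up pages:

```python
# Toy sketch of early frequency-based ranking: score each page by how
# often the query terms appear, then sort. This is exactly what
# "keyword stuffing" exploited.
def rank(pages, query):
    terms = query.lower().split()
    def score(text):
        words = text.lower().split()
        return sum(words.count(t) for t in terms)
    return sorted(pages, key=lambda p: score(pages[p]), reverse=True)

pages = {
    "honest-page": "a useful article about coffee brewing",
    "stuffed-page": "coffee coffee coffee brewing coffee coffee",
}
print(rank(pages, "coffee brewing"))  # the stuffed page wins
```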

Then came the concept of link importance: search engines valued sites with lots of incoming links because they interpreted site popularity as relevance. But this led to link spamming all over the web. Nowadays, search engines weight links depending on the “authority” of the linking site. Search engines put more value on links from a government agency than links from a link directory.

Today, ranking algorithms are shrouded in more mystery than ever before, and “search engine optimization” isn’t so important. Good search engine rankings now come from high-quality content and great user experiences.






SEO is Science and Art

For many people who aren’t involved in search engine optimization (SEO) on a regular basis, it’s easy (or so they think). You simply create a website, write some content, and then get links from as many sources as you can.

Perhaps that works. Sometimes.

More often than not, the craft of SEO is truly a unique practice. It’s often misunderstood and can be painfully difficult to staff for. Here’s why.

SEO is Science


By definition, “Science” is:


  • a branch of knowledge or study dealing with a body of facts or truths systematically arranged and showing the operation of general laws: the mathematical sciences.
  • systematic knowledge of the physical or material world gained through observation and experimentation.
  • any of the branches of natural or physical science.
  • systematized knowledge in general.
  • knowledge, as of facts or principles; knowledge gained by systematic study.


Anyone who has performed professional SEO services for any length of time will tell you that at any given time we have definitely practiced each of the above. In some cases, changes in our industry are so rapid that we crowdsource the science experiments among peers (via WebmasterWorld forums or Search Engine Watch forums).

Unfortunately, Google doesn’t provide step-by-step instruction for optimization of every single website. Every website is unique. Every optimization process/project is unique.

Every website represents new and interesting optimization challenges. All require at least some experimentation. Most SEOs follow strict methods of testing/monitoring/measuring so that we know what works and what doesn’t.

We have a few guidelines along the way:


  • Our “branch of knowledge” is grounded in what Google provides in their Webmaster Guidelines and SEO Starter Guide.
  • Our unique experience. Just like you might “learn” marketing by getting your bachelor’s degree in marketing, you really aren’t very good at it until you’ve worked in your field and gained real-world experience. There are plenty of things you can read in the blogosphere regarding SEO that are complete crap. But if you didn’t know any better, you’d buy into them because “it sounds reasonable, so it must be true!” So, be careful not to claim something is 100 percent “true” unless you have enough “scientific” evidence to back up the claim. Otherwise, it’s called a “hypothesis”:



  1. A supposition or proposed explanation made on the basis of limited evidence as a starting point for further investigation.
  2. A proposition made as a basis for reasoning, without any assumption of its truth.

SEO is Also Art


By definition, art is:

"the conscious use of skill and creative imagination especially in the production of aesthetic objects"

I’ve worked with and befriended many incredibly bright SEOs in my years in this business. Those who manage to blend scientific skills with creative thinking about how to experiment with and improve programs are the gems.

Getting creative with SEO is thinking of how a marketing program can encompass social, graphic design, link building, content generation, and PR to drive toward a common goal.

Getting creative with SEO is also about reworking a website’s design and code so that usability and accessibility improve, while maintaining brand guidelines and “look and feel” requirements.

Every day, we must get creative in deciding how to best target keywords, determining which method of content generation gives us the best chance at gaining a presence in the search engines and – most importantly – engaging our audience.

Should we write a blog post? Would this be best handled in a press release? How about a video? An infographic? A new “corporate” page on the site? There are a multitude of ways we might choose to target a keyword via content.



Sunday, November 19, 2017

The Importance of Search Engines in SEO

An important aspect of SEO is making your website easy for both users and search engine robots to understand. Although search engines have become increasingly sophisticated, they still can't see and understand a web page the same way a human can. SEO helps the engines figure out what each page is about, and how it may be useful for users.

A Common Argument Against SEO

We frequently hear statements like this:

"No smart engineer would ever build a search engine that requires websites to follow certain rules or principles in order to be ranked or indexed. Anyone with half a brain would want a system that can crawl through any architecture, parse any amount of complex or imperfect code, and still find a way to return the most relevant results, not the ones that have been 'optimized' by unlicensed search marketing experts."

But Wait ...

Imagine you posted online a picture of your family dog. A human might describe it as "a black, medium-sized dog, looks like a Lab, playing fetch in the park." On the other hand, the best search engine in the world would struggle to understand the photo at anywhere near that level of sophistication. How do you make a search engine understand a photograph? Fortunately, SEO allows webmasters to provide clues that the engines can use to understand the content. In fact, adding proper structure to your content is essential to SEO.

Understanding both the abilities and limitations of search engines allows you to properly build, format, and annotate your web content in a way that search engines can digest. Without SEO, a website can be invisible to search engines.

The Limits of Search Engine Technology


Automated search bots crawl the web, follow links, and index content in massive databases. They accomplish this with dazzling artificial intelligence, but modern search technology is not all-powerful. There are numerous technical limitations that cause significant problems in both inclusion and rankings. We've listed the most common below:

Problems Crawling and Indexing:

Online forms: Search engines aren't good at completing online forms (such as a login), and thus any content contained behind them may remain hidden.

Duplicate pages: Websites using a CMS (Content Management System) often create duplicate versions of the same page; this is a major problem for search engines looking for completely original content.

Blocked in the code: Errors in a website's crawling directives (robots.txt) may block search engines entirely (see the sketch after this list).

Poor link structures: If a website's link structure isn't understandable to the search engines, they may not reach all of a website's content; or, if it is crawled, the minimally-exposed content may be deemed unimportant by the engine's index.

Non-text content: Although the engines are getting better at reading non-HTML text, content in rich media formats is still difficult for search engines to parse. This includes text in Flash files, images, photos, video, audio, and plug-in content.
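For the robots.txt point above, you can check whether a crawler is blocked from a given URL with Python’s standard urllib.robotparser module; the URLs below are placeholders:

    from urllib.robotparser import RobotFileParser

    robots = RobotFileParser()
    robots.set_url("https://example.com/robots.txt")
    robots.read()  # fetch and parse the live file

    # Would a generic crawler ("*") be allowed to fetch this page?
    print(robots.can_fetch("*", "https://example.com/private/login"))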

Problems Matching Queries to Content

Uncommon terms: Text that is not written in the common terms that people use to search. For example, writing about "food cooling units" when people actually search for "refrigerators."

Language and internationalization subtleties: For example, “colour” vs. “color.” When in doubt, check what people are searching for and use exact matches in your content.

Incongruous location targeting: Targeting content in Polish when the majority of the people who would visit your website are from Japan.

Mixed contextual signals: For example, the title of your blog post is “Mexico’s Best Coffee” but the post itself is about a vacation resort in Canada that happens to serve great coffee. These mixed messages send confusing signals to search engines.

Make sure your content gets seen

Getting the technical details of search engine-friendly web development correct is important, but once the basics are covered, you must also market your content. The engines by themselves have no formulas to gauge the quality of content on the web. Instead, search technology relies on the metrics of relevance and importance, and they measure those metrics by tracking what people do: what they discover, react to, comment on, and link to. So you can’t just build a perfect website and write great content; you also have to get that content shared and talked about.



Friday, November 17, 2017

Types of Internet Marketing Method

There are several types of Internet marketing; some work alone, while others work in conjunction with one another. Here is a summary:

Search Engine Marketing (SEM)

A type of web marketing which promotes websites by increasing visibility in search engine results pages through search engine optimization as well as through paid advertising, strategic content marketing, and social media networks.

Search Engine Optimization (SEO)

A refinement of SEM that improves the ranking or visibility of a web page in search engine results. SEO programs work to move targeted results higher in the rankings presented to users of search engines such as Google, Bing, Yahoo!, and others. The higher the ranking, the more likely consumers are to click on the link and go to the targeted website.

Display Advertising

Advertising in a static, set space composed of images or artwork and words, similar to ads in newspapers and magazines.

Pay Per Click Advertising

Advertising presented on speculation by a web publisher, such as on a search engine results page or a browser home page, which charges the advertiser only for the number of times someone clicks the ad to go to the targeted website, not for the number of people who view it.

Social Media Marketing

Marketing using social media outlets such as Facebook or similar sites. This type of marketing includes creating pages that directly promote a company, organization, or product and can be easily accessed from inside or outside the site. Social media marketing can also factor into SEO programs.

E-Mail Marketing

Marketing based on distributing a message via e-mail. E-mail marketing can consist of a text message, a combination of words and images as in a display ad, or access to a video on a website or public video site such as YouTube. E-mail marketing can also include a link to a specific website to drive traffic and revenue to that site.

Referral Marketing

One of the most subtle forms of web marketing, referral marketing relies on one individual being pleased enough with a website or social media site to refer it to another person, hopefully creating a chain reaction of referrals from one group of individuals to another. Referral marketing can also be a major component of SEO programs.

Affiliate Marketing

Marketing by a third party that refers customers to a specific website or vendor. “Affiliates” market their own products, such as through a website, but also link to other websites that are unrelated to their own yet share a common interest for consumers. Affiliates are rewarded for the number of times someone follows a link from their site to the targeted site.

Inbound Marketing

A method of drawing attention and visits to a website by publishing information that people are already seeking, such as valuable blog posts, articles, or general content beyond the site’s main purpose. When a search engine user searches for that specific content, the website containing it is displayed, even though the content is not the site’s main purpose. By drawing search engine users in through this secondary information, you expose consumers to the website and its main offerings without their having specifically searched for the site.

Video Marketing

Marketing through the use of videos, such as those found on YouTube or similar sites. Videos can be of any length (depending on the limitations of the hosting site) and can carry any content, message, or advocacy for a cause. They may be as simple as an individual standing in front of a camera talking, or have the full, rich production values you would find in a movie theater. Video marketing marries the strengths of sound and moving images to present a powerful message.

Benefits of Internet Marketing

Convenience

Internet marketing enables you to be open for business around the clock without worrying about store opening hours or overtime payments for staff. Offering your products on the Internet is also convenient for customers. They can browse your online store at any time and place orders when it is convenient for them.

Reach

By marketing on the Internet, you can overcome barriers of distance. You can sell goods in any part of the country without setting up local outlets, widening your target market. You can also build an export business without opening a network of distributors in different countries. However, if you want to sell internationally, you should use localization services to ensure that your products are suitable for local markets and comply with local business regulations. Localization services include translation and product modification to reflect local market differences.

Cost

Marketing products on the Internet costs less than marketing them through a physical retail outlet. You do not have the recurring costs of property rental and maintenance. You do not have to purchase stock for display in a store. You can order stock in line with demand, keeping your inventory costs low.

Personalization

Internet marketing enables you to personalize offers to customers by building a profile of their purchasing history and preferences. By tracking the web pages and product information that prospects visit, you can make targeted offers that reflect their interests. The information available from tracking website visits also provides data for planning cross-selling campaigns so that you can increase the value of sales by customer.

Relationships

The Internet provides an important platform for building relationships with customers and increasing customer retention levels. When a customer has purchased a product from your online store, you can begin the relationship by sending a follow-up email to confirm the transaction and thank the customer. Emailing customers regularly with special, personalized offers helps to maintain the relationship. You can also invite customers to submit product reviews on your website, helping to build a sense of community.

Social

Internet marketing enables you to take advantage of the growing importance of social media. An article on the Harvard Business School Executive Education website highlighted the link between social networking and online revenue growth. According to the article, a group of consumers that responded most strongly to the influence of social networks generated increased sales of around 5 percent. You can take advantage of this type of influence by incorporating social networking tools in your Internet marketing campaigns.

Wednesday, November 15, 2017

What is Internet Marketing?


Defining Internet Marketing

Also called online marketing, Internet marketing is the process of promoting a brand, products, or services over the Internet. Its broad scope includes email marketing, electronic customer relationship management, and any promotional activities done via wireless media.

It also combines the technical and creative aspects of the World Wide Web, such as advertising, design, development, and sales. Moreover, Internet marketing deals with creating and placing ads throughout the various stages of the customer engagement cycle.

Online marketing is divided into different types:

Affiliate Marketing

It is a marketing practice wherein a business pays an online retailer, e-commerce site, or blog for each visitor or sale that these websites generate for its brand.

Display Advertising

This refers to banner advertisements displayed on other websites or blogs to drive traffic to your own content. This, in turn, can increase product awareness.

Email Marketing

From the name itself, this is a marketing process that involves reaching out to your customers via email.

Inbound Marketing

This type of Internet marketing involves sharing free, valuable content with your target market to convince them to become loyal customers. This could be done by setting up a business blog.

Search Engine Marketing

This is a form of marketing that promotes a business through paid advertisements that appear on search engine results pages. It includes paid placement, contextual advertising, and paid inclusion.

Search Engine Optimization

In contrast to SEM, SEO uses unpaid, natural methods of promoting content on SERPs, including keyword research and placement, link building, and social media marketing.

Social Media Marketing

Based on its name, social media marketing is the process of promoting a website through various social networks like Facebook, Twitter, Google+, LinkedIn, Pinterest and more.

Why Is Internet Marketing Important?

The Internet has the power to connect millions of people from around the world, so it also has the power to bring your business to millions of potential customers worldwide. What makes this the best addition to your promotional efforts is that you don’t need to shell out a lot of money.

In addition, the effectiveness of your campaign can be easily measured using web analytics and cost-volume-profit analysis tools. However, it requires you to learn the many facets of Internet marketing so that you’ll know whether your efforts are giving the return on investment that you want for your business.




How Search Engines Work in SEO



Search engines have two major functions: crawling and building an index, and providing search users with a ranked list of the websites they've determined are the most relevant.

Crawling and Indexing

Imagine the World Wide Web as a network of stops in a big city subway system.

Each stop is a unique document (usually a web page, but sometimes a PDF, JPG, or another file). The search engines need a way to “crawl” the entire city and find all the stops along the way, so they use the best path available—links.


The link structure of the web serves to bind all of the pages together.

Links allow the search engines' automated robots, called "crawlers" or "spiders," to reach the many billions of interconnected documents on the web.

Once the engines find these pages, they decipher the code from them and store selected pieces in massive databases, to be recalled later when needed for a search query. To accomplish the monumental task of holding billions of pages that can be accessed in a fraction of a second, the search engine companies have constructed datacenters all over the world.
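Those “selected pieces” typically include things like the page title and meta description. Here is a toy extractor in Python (the sample page is made up):

    from html.parser import HTMLParser

    class SnippetExtractor(HTMLParser):
        """Pulls out a page's <title> text and meta description."""
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.description = ""

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True
            elif tag == "meta":
                attrs = dict(attrs)
                if (attrs.get("name") or "").lower() == "description":
                    self.description = attrs.get("content") or ""

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    page = ('<html><head><title>Best Coffee in Mexico</title>'
            '<meta name="description" content="A guide to great coffee.">'
            '</head></html>')
    extractor = SnippetExtractor()
    extractor.feed(page)
    print(extractor.title, "|", extractor.description)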

These monstrous storage facilities hold thousands of machines processing large quantities of information very quickly. When a person performs a search at any of the major engines, they demand results instantaneously; even a one- or two-second delay can cause dissatisfaction, so the engines work hard to provide answers as fast as possible.

Providing Answers

Search engines are answer machines. When a person performs an online search, the search engine scours its corpus of billions of documents and does two things: first, it returns only those results that are relevant or useful to the searcher's query; second, it ranks those results according to the popularity of the websites serving the information. It is both relevance and popularity that the process of SEO is meant to influence.

How do search engines determine relevance and popularity?

To a search engine, relevance means more than finding a page with the right words. In the early days of the web, search engines didn’t go much further than this simplistic step, and search results were of limited value. Over the years, smart engineers have devised better ways to match results to searchers’ queries. Today, hundreds of factors influence relevance, and we’ll discuss the most important of these in this guide.

Search engines typically assume that the more popular a site, page, or document, the more valuable the information it contains must be. This assumption has proven fairly successful in terms of user satisfaction with search results.

Popularity and relevance aren’t determined manually. Instead, the engines employ mathematical equations (algorithms) to sort the wheat from the chaff (relevance), and then to rank the wheat in order of quality (popularity).
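As a purely illustrative sketch (the weights and scores below are invented; real algorithms combine hundreds of factors), you can picture the final ordering as a weighted blend of the two signals:

    def combined_score(relevance, popularity, w_rel=0.7, w_pop=0.3):
        """Blend two signals into one ranking score (toy weights)."""
        return w_rel * relevance + w_pop * popularity

    candidates = {
        "page-a": (0.9, 0.3),  # highly relevant, little-known
        "page-b": (0.6, 0.9),  # somewhat relevant, very popular
    }
    ranked = sorted(candidates, key=lambda p: -combined_score(*candidates[p]))
    print(ranked)  # ['page-a', 'page-b'] with these invented weights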

These algorithms often comprise hundreds of variables. In the search marketing field, we refer to them as “ranking factors.” Moz crafted a resource specifically on this subject: Search Engine Ranking Factors.