Best 5 Inbound Marketing Tools!
by Kanika Kapoor 22:30:00 1 comments
Top 15 SEO Tools of 2015
by Kanika Kapoor 22:15:00 0 comments
Looking for Online Email Marketing Tools?
by Kanika Kapoor 20:42:00 0 comments
1. Mailchimp (Create, send and evaluate campaigns for free – up to 2,000 subscribers)
2. Senderscore.org (Identify if the email engine is set well for performance & verify your domain reputation score – Free tool)
3. Mxtool (Are you on the blacklist? – find out – free tool)
http://mxtoolbox.com/blacklists.aspx and more
4. Who's your registrar? Not sure? Find vital details about your domain – Free Tool
5. Is your SPF record incomplete? Use the following link to verify and complete it – Free Tool
6. Having trouble writing a headline? Quickly create catchy headlines for your blog or emails – Free Tool
7. Subject-Line Test – Free Tool
8. Pre-header visibility & Email Compatibility Check – 7 days free
http://www.contactology.com/ – Free
9. What are others mailing? Sneak a peek into other senders' email templates, content & analytics.
http://emailinsights.com/ – 30 days free
whosmailingwhat.com/ – Premium tool and expensive
10. Which test won? A/B Tested HTML templates from various brands
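To illustrate the kind of checks a subject-line test (tool #7 above) performs, here is a minimal Python sketch. The length threshold and the spam-word list are invented for illustration; they are not the rules of any specific tool.

```python
# Minimal subject-line checker: flags common deliverability issues.
# The threshold (60 chars) and the spam-word list are illustrative assumptions.

SPAM_WORDS = {"free", "winner", "urgent", "act now", "guarantee"}

def check_subject(subject: str) -> list[str]:
    """Return a list of warnings for an email subject line."""
    warnings = []
    if len(subject) > 60:
        warnings.append("too long: may be truncated in inbox previews")
    if subject.isupper():
        warnings.append("all caps: often triggers spam filters")
    if subject.count("!") > 1:
        warnings.append("multiple exclamation marks")
    lowered = subject.lower()
    for word in sorted(SPAM_WORDS):
        if word in lowered:
            warnings.append(f"contains spam-trigger word: {word!r}")
    return warnings
```

A clean subject such as "Your October newsletter" returns no warnings, while "Free gift!!!" trips both the punctuation and spam-word checks.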
All you need to know about Google Page Rank!
by Kanika Kapoor 20:47:00 0 comments
If you do SEO or are involved with Google or search in any way, you will come across this topic eventually, and you may well be confused about what exactly PageRank means. To untangle that, here is a straightforward guide to PageRank, intended for searchers and site owners alike.
PageRank is an algorithm which is used by Google Search to rank websites in their search engine results.
According to Google:
PageRank works by counting the number and quality of links to a page to arrive at a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.
PageRank is a link analysis algorithm that assigns a numerical weight to each element of a hyperlinked set of documents, such as the World Wide Web, with the aim of "measuring" its relative importance within the set. The algorithm can be applied to any collection of entities with reciprocal quotations and references.
A PageRank results from a mathematical algorithm based on the webgraph, formed by all World Wide Web pages as nodes and hyperlinks as edges, taking into consideration authority hubs such as cnn.com or usa.gov. A link to a page counts as a vote of support. The PageRank of a page is defined recursively and depends on the number and PageRank of all pages that link to it ("incoming links"). A page that is linked to by many pages with high PageRank receives a high rank itself.
One main disadvantage of PageRank is that it favors older pages. A new page, even a very good one, will not have many links unless it is part of an existing site (a site being a densely connected set of pages, such as Wikipedia).
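The recursive definition above can be computed by simple power iteration. The following Python sketch is a minimal illustration, using the damping factor of 0.85 commonly cited for the original formulation; it is not Google's actual implementation, and the three-page graph is a made-up example.

```python
# Minimal PageRank via power iteration.
# graph maps each page to the list of pages it links to.

def pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start uniform
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:                     # dangling page: spread evenly
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share    # incoming-link contribution
        rank = new_rank
    return rank

web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(web)
```

Here "c" ends up with the highest rank because it receives links from both "a" and "b", which matches the intuition above: a page linked to by many (or important) pages ranks high.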
Basic Guide to Google's EMD Algorithm Update!
by Kanika Kapoor 22:27:00 1 comments
The EMD Update — short for "Exact Match Domain" — is a filter Google launched in September 2012 to keep low-quality sites from ranking well simply because they had words in their domain names that matched search terms. When a new EMD Update happens, sites that have improved their content may regain good rankings. New sites with poor content, or those previously missed by EMD, may get caught. In addition, "false positives" may get released. Our most recent news about the EMD Update is below.
Google has unleashed yet another algorithm as part of a series of updates aimed at giving users better search results and a better experience. This time, Google's update, dubbed the "EMD update," concentrates on ridding the SERPs of spammy or low-quality "exact match" domains.
For years, SEOs have known the benefit of registering domain names that use exactly the keywords a site is optimizing for. For instance, if a webmaster wanted an easy route to the top of the search results for the keyword "Marketing Earth," he or she would try to register the domain www.marketingearth.com.
Exact match domains have always had a hugely positive effect on rankings. Lucky owners of exact match domains for highly trafficked keywords have long enjoyed easy rankings and the wealth of highly targeted organic search traffic that results. Unfortunately, exact match domains are frequently extremely spammy.
Most of them lack quality content and instead are loaded with keyword-rich, useless articles that look great to a search engine spider but are worthless to human readers. Owners of these sites monetize them with ads and affiliate links, caring about the money and nothing for the user experience.
Now, with the EMD algorithm update, Google has revoked the long-standing ranking boost given by exact match domains in an effort to level the playing field, remove spammy sites from its search results, and deliver a much more natural and semantic way of providing information through search.
What is Google's EMD Algorithmic Update?
So how does it work? According to Matt Cutts, via his tweet in September 2012, EMD is set to "reduce low-quality 'exact match' domains in search results."
It's still early, but it appears the update is not intended to wipe the search results entirely clean of websites with spammy domain names. Rather, it is designed to keep the results free of anything that could ruin the user experience.
In addition, Danny Sullivan of Search Engine Land wrote that Google confirmed the EMD algorithm will run periodically, so that sites that have been hit stay filtered or get a chance to escape the filter, and so it can catch what Google may have missed during the previous pass.
There is no doubt that Google wants its search results to be natural and free of manipulation. What used to be one of the industry's most powerful ranking strategies is now something that could endanger a site's chances of search visibility.
Who Got Hit (and Why Should You Care)?
According to data presented by SEOMoz, 41 EMDs from their data set of 1,000 SERPs dropped out of the top 10, with others seeing a steep decline in their rankings.
While it is clear that the EMD update targets sites with exact match keywords, it appears to spare websites that have strong brand recognition and excellent content. Websites with exact match domains that are likely to be hit are those that were obviously purchased or registered solely for the purpose of ranking a site to make easy money.
How does Google distinguish between low-quality EMDs and high-quality EMDs?
Right now, this question is open to speculation, but I think Google probably uses the same trust indicators as it does for any other site: links and social signals. Moreover, Google is getting better at determining whether on-site content is low or high quality even without other trust indicators.
Content that uses appropriate formatting, grammar, and spelling will be graded higher, as will content that uses helpful internal and external linking. The destination of the external links matters, too. Links to domains that Google considers low-quality, spammy, or in a "bad neighborhood" will actually cause your content to lose points in the ranking calculation.
How can I recover, or make sure my EMD site doesn't get hit by the new EMD algorithm?
Here's a step-by-step process for protecting (or recovering) your EMD domain:
Step 1: Remove or expand any content on your EMD site that could be considered low quality. Ask yourself whether the content is written for search engines or provides genuine value to your readers. Be honest.
Step 2: Get an inbound link profile audit to identify spammy inbound links that could be sending negative trust signals to Google, then engage in a link removal campaign to remove as many of them as possible.
Step 3: Add social sharing buttons to all of your content, if you don't have them already.
Step 4: Get in the habit of regularly adding fresh, high-quality content to your site (more is always better, but I suggest once per day). If you don't have time to write your own content, outsource it to a professional writer.
Step 5: Engage in an SEO link building campaign to increase your site's credibility and authority. Guest blogging services are available to help with quality, ethical link building strategies that are endorsed by Google and Bing.
Step 6: Engage in a social media marketing campaign to build "social proof" through social signals.
What's on the horizon?
We can think of the EMD update as a companion update to Panda and Penguin. Recall that the Panda update specifically targets websites with low-quality or thin content. With EMD periodically "filtering" Google's index for spammy domain names alongside Panda doing its job, we'll soon see the SERPs populated with more relevant and higher-quality sites.
We'll also continue to see huge amounts of wasted effort put into SEO strategies that were once accepted and worked well, but are now relics of the past. I imagine it will take months or years before many people stop relying on outdated tactics and techniques.
What is an Exact-Match Domain?
An exact match domain is a domain name that exactly matches the searched keyword phrase of a user and contains no dashes. For instance, if you search Google for the keyword phrase "bottles," then bottles.com would be the exact match domain name.
Examples of Exact-Match Domains
The purest example of an exact match domain name is a single generic word that defines a product, service or industry, but exact match domain names extend to multiple words – often called long-tail search queries.
Countless examples exist for both single and multiple words, and they are owned by both small and large organizations.
What Everyone Ought To Know About Google Hummingbird
by Kanika Kapoor 21:56:00 1 comments
Google Hummingbird is a search algorithm used by Google. The name derives from its being "precise and fast".
Google began using Hummingbird around August 30, 2013, and announced the change on September 26, on the eve of the company's 15th anniversary.
The Hummingbird update was the first major update to Google's search algorithm since the 2010 "Caffeine" update, and even that was limited mostly to improving the indexing of information rather than the sorting of it. Google search chief Amit Singhal stated that Hummingbird is the first major rewrite of its kind since 2001.
Conversational search leverages natural language, semantic search and more to improve the way search queries are parsed. Unlike previous search algorithms, which would focus on each individual word in the search query, Hummingbird considers not just each word but how each word contributes to the whole of the query — the entire sentence, conversation or meaning is considered, rather than particular words.
Much like an extension of Google's "Knowledge Graph", Hummingbird aims to make interactions more human — capable of understanding the concepts and relationships between keywords.
Hummingbird places greater emphasis on page content, making search results more relevant and germane and ensuring that Google sends users to the most appropriate page of a site, rather than to a homepage or top-level page.
SEO changed little with the arrival of Hummingbird, but the top-ranking results now tend to be those that provide natural content that reads conversationally. While keywords within the query continue to be important, Hummingbird adds more weight to long-tail keywords — effectively catering to the optimization of content rather than just keywords. Webmasters now need to cater to questions that are asked naturally; with the growing number of conversational queries — namely those using voice search — targeting phrases that begin with "Who, Why, Where, and How" will prove useful for SEO. The handling of keyword synonyms has also been upgraded with Hummingbird, rather than only listing results that contain the exact phrases or keywords.
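To illustrate the idea of matching on meaning rather than exact words, here is a toy Python sketch that expands a query with synonyms before matching it against page text. The synonym table is an invented example; Hummingbird's real semantic models are far more sophisticated and not public.

```python
# Toy "semantic" matcher: expands query words with synonyms so that
# pages phrased differently from the query can still match.
# The synonym table is a made-up illustration.

SYNONYMS = {
    "buy": {"purchase", "order"},
    "cheap": {"inexpensive", "affordable"},
    "phone": {"smartphone", "mobile"},
}

def expand(word: str) -> set[str]:
    """Return the word itself plus any listed synonyms."""
    return {word} | SYNONYMS.get(word, set())

def matches(query: str, page_text: str) -> bool:
    """True if every query word (or a synonym of it) appears in the page."""
    page_words = set(page_text.lower().split())
    return all(expand(w) & page_words for w in query.lower().split())
```

With this, the query "buy cheap phone" matches a page that says "order an affordable smartphone today", even though none of the query words appear literally.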
What You Need To Know About Google Penguin?
by Kanika Kapoor 21:54:00 0 comments
Google first launched Penguin back in April 2012. The algorithm was introduced to fight webspam in search results. The Google Penguin updates fundamentally try to prevent various types of search engine spam (also called spamdexing or black hat SEO) from being rewarded with higher-placed search results. Search engine spam can include activities such as link spamming, the use of invisible text on web pages, keyword stuffing, duplication of copyrighted content from high-ranking sites, and more.
How Often Are Google Penguin Updates Rolled Out?
Google rolled out the first Penguin (1.0) in April 2012, and the company estimated it affected about 3 percent of all English-language search queries. When you consider the number of search queries Google receives on any given day, that is a huge number.
Google doesn't always announce changes to Penguin, but there have been at least five Google Penguin updates, including a major one, Penguin 2.0, in May 2013. Google added new signals to that update to combat some of the black hat techniques it hadn't caught in the earlier one. This time, it affected 2.3 percent of all queries.
The next release, Penguin 2.1, arrived soon thereafter in October 2013. Many people expected that, given the long delay, Google would add yet more new signals to the algorithm. However, from what Matt Cutts has said, it appears to have been just a data refresh. Penguin 3.0, in turn, affected around 1 percent of all English queries. That is still a big number, though not as large as one would expect after such a long wait.
How to Stay Penguin-Proof
Google's primary goal with its search engine is to provide users with the best possible results. If Google returned spammy websites in the results, people would stop using it.
The best way to stay protected against future algorithms is to stay far away from spammy practices. Do not try to game the system by amassing links and over-optimizing your site for keywords.
Instead, simply create quality content that people will love to read. Those people will then tell their friends on social networks, and word about your site will spread. Before long, other websites will link to your content and you'll build links naturally that way.
With outstanding content and a scalable outreach process, you'll get to the top of Google and you'll stay there, regardless of any algorithm updates.
How Does Google Penguin Differ from Google Panda and Google Hummingbird?
While Google Penguin shares similarities with two other algorithmic update projects from Google, Google Panda and Google Hummingbird, Penguin's particular focus is on penalizing organizations and web designers that deliberately try to "boost" their search engine rankings through manipulative SEO strategies.
Google Panda, on the other hand, specifically targets low-quality content sites by demoting them in the search results so that higher-quality websites can gain more visibility.
The third project, Google Hummingbird, focuses on introducing an entirely new search algorithm, whereas Google Penguin and Panda both serve as updates to Google's existing search algorithm engine.
What is the Google Panda Algorithm?
by Kanika Kapoor 21:54:00 2 comments
Google Panda is a change to Google's search results ranking algorithm that was first released in February 2011. The change aimed to lower the rank of "low-quality sites" or "thin sites" and return higher-quality sites near the top of the search results. CNET reported a surge in the rankings of news sites and social networking sites, and a drop in rankings for sites containing large amounts of advertising. This change reportedly affected the rankings of almost 12 percent of all search results. Soon after the Panda rollout, many sites, including Google's webmaster forum, became filled with complaints that scrapers/copyright infringers were getting better rankings than sites with original content. At one point, Google publicly asked for data points to help detect scrapers better.
How Often Are Google Panda Updates Rolled Out?
The first Panda update appeared in February 2011, and at least three additional major updates have followed, the most recent being May 2014's Panda 4.0 update. The company also has a history of rolling out minor updates, sometimes as frequently as monthly.
The Panda updates are closely followed by the Search Engine Optimization (SEO) industry as well as by businesses and web developers around the world, as the Panda changes can significantly impact the amount of traffic a site receives from natural, or organic, search results.
Google offers regular advisories on its blog to give guidance to SEO firms, web designers and content providers on improving the content and design of their sites and pages, so they can avoid being demoted, or penalized, in the search engine results.
Google Panda is a filter that prevents low-quality websites and/or pages from ranking well in the search engine results page. The filter's threshold is influenced by Google Quality Raters. Quality Raters answer questions such as "would I trust this site with my credit card?" so that Google can learn the difference between high- and low-quality sites.
The Google Panda patent, filed on September 28, 2012, was granted on March 25, 2014. The patent states that Google Panda creates a ratio between a site's inbound links and reference queries (search queries for the site's brand). That ratio is then used to create a sitewide modification factor. The sitewide modification factor is in turn used to create a modification factor for a page based upon a search query. If the page fails to meet a certain threshold, the modification factor is applied and, as a result, the page ranks lower in the search engine results page.
Google Panda affects the ranking of an entire site or a particular section of it, rather than just individual pages.
In March 2012, Google updated Panda. Google says it takes only a few pages of low-quality or duplicated content to hold down traffic on an otherwise solid site, and it recommends such pages be removed, blocked from being indexed by the search engine, or rewritten. However, Matt Cutts, head of webspam at Google, cautions that rewriting duplicate content so that it is unique may not be enough to recover from Panda; the rewrites must be of sufficiently high quality, so that the content brings "additional value" to the web. Content that is generic, non-specific, and not substantially different from what is already out there should not be expected to rank well: "Those other sites are not bringing additional value. While they're not duplicates they don't bring anything new to the table."
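The ratio-and-threshold mechanism described in the patent can be sketched roughly as follows. All the constants and the exact formula here are illustrative guesses, since the patent does not disclose Google's actual values.

```python
# Rough sketch of the mechanism the Panda patent describes: a ratio of
# inbound links to brand ("reference") queries yields a sitewide
# modification factor, which is applied only to pages that fall below a
# quality threshold. Every constant below is an illustrative assumption.

def sitewide_factor(inbound_links: int, reference_queries: int) -> float:
    """More brand queries relative to links -> factor closer to 1 (no demotion)."""
    if reference_queries == 0:
        return 0.5                       # no brand interest at all: strong demotion
    ratio = inbound_links / reference_queries
    return min(1.0, 1.0 / ratio) if ratio > 1 else 1.0

def adjusted_score(page_score: float, factor: float, threshold: float = 0.6) -> float:
    """Apply the sitewide factor only to pages below the quality threshold."""
    return page_score * factor if page_score < threshold else page_score
```

In this toy model, a site with 1,000 inbound links but only 100 brand queries gets a strong demotion factor, and that factor drags down only its weaker pages, which mirrors the patent's sitewide-then-per-page description.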
How Does Google Panda Differ from Google Penguin and Google Hummingbird?
Google Panda is frequently confused with two other algorithm updates from Google, Google Penguin and Google Hummingbird. Google Panda updates are focused primarily on ensuring that low-quality, poor-content sites are pushed further down the search results so that higher-quality websites can receive priority.
Google Penguin updates, on the other hand, target sites that use black hat SEO in an attempt to boost their search engine results. These websites breach the Google Webmaster Guidelines, and accordingly, Google Penguin updates penalize them in the search engine's results.
While Google Panda and Penguin both serve as updates to Google's existing search algorithm engine, Google Hummingbird delivers an entirely new search algorithm. Google Hummingbird seeks to improve the search experience for users by going beyond keyword focus and instead considering more of the context and surrounding content of the whole search phrase, offering a natural-language, or conversational, approach to search queries.
Why Does Google Penalize a Website?
by Kanika Kapoor 22:47:00 0 comments
Google penalizes websites for engaging in practices that are against its Webmaster Guidelines. These penalties can be the result of a manual review or of algorithm updates such as Google Penguin.
Google penalties can cause a drop in rankings for every page of a site, for a specific keyword, or for a specific page. Any drop in rankings brings with it a significant drop in traffic for the site.
To find out whether a site has been affected by a Google penalty, website owners can use Google Webmaster Tools and also compare the timing of their traffic drop with the timing of known Google updates.
Google Penalties
Google has been updating its algorithm for as long as it has been fighting the manipulation of organic search results. However, until Google launched the Google Penguin update in 2012, many people wrongly believed that low-quality backlinks would not negatively affect rankings. While this view was common, it was not right, as Google had been applying such link-based penalties for a long time without making public how the company approached and dealt with what it called "link spam". Since then there has been a much wider acknowledgment of the dangers of bad SEO, and routine analysis of backlinks to ensure there are no harmful links.
Link-based Penalties
Penalties are generally caused by manipulative backlinks that are designed to favor particular companies in the search results; by adding such links, companies break Google's terms and conditions. When Google discovers such links, it imposes penalties to discourage other companies from following this practice and to remove any gains that may have been enjoyed from such links. Google also penalizes those who participated in the manipulation and helped other companies by linking to them. These are often low-quality directories which simply listed a link to a company website with manipulative anchor text for a fee. Google argues that such pages offer no value to the Internet and they are often deindexed as a result. Such links are often referred to as paid links.
Types of link spam
1. Paid links
Paid links are simply links that people place on their site for a fee, believing this will have a positive effect on the search results. The practice of buying links was very popular before the Penguin update, when companies believed they could add any kind of link with impunity, since Google had previously claimed that it simply ignored such links when it detected them rather than penalizing sites. To comply with Google's current terms of service, it is essential to apply the nofollow attribute to paid advertising links.
2. Blog networks
Blog networks are groups of sometimes thousands of blogs that aim to appear unconnected, and which then link out to those prepared to pay for such links. Google has repeatedly targeted blog networks, and once it identifies them, it has penalized thousands of sites that gained benefits from them.
3. Comment spam
These are links left in the comments of articles that are difficult to have removed. As this practice became so widespread, Google launched what is referred to as the nofollow tag, which blog platforms quickly incorporated to help curb such practices. The nofollow tag simply tells search engines not to trust such links.
4. Guest blog posts
Guest blog posts became popular as a practice following Penguin, as they were viewed as a 'white hat' tactic for some time. However, Google has since stated that it considers these links to be spam.
Search Engine Penalties and Recoveries
by Kanika Kapoor 00:34:00 0 comments
If your site rankings have dropped recently, there are a few different types of penalties that could be affecting your site. Google's two main algorithms that determine search rankings are called Panda and Penguin. Panda is the algorithm that assesses the quality and relevance of your site's content as it relates to your targeted keywords.
Your site is often the lifeblood of a business; it's your virtual storefront. Having it vanish from Google (by far the most popular search engine) is unthinkable for most businesses. If you've been penalized, making a point of recovering the right way, with long-term goals in view, is the best practice.
Google Panda Penalty Recovery
Websites that get penalized by Google's Panda algorithm get hit for one of two reasons: thin content or duplicate content. This means that your site's pages are not very descriptive or informative, or are copies of other pages on your site (or worse, other sites). This type of penalty is the easier one to recover from, as it is controlled on-site and Google's Panda updates happen more often.
Google Penguin Penalty Recovery
If your site was suddenly hit hard and your rankings dropped drastically, chances are that it's related to Google's Penguin algorithm. This is the most difficult penalty to recover from and, unfortunately, often the most common. Low-quality SEO firms regularly add a site to private blog networks, low-quality (or fake) directories and fake sites in an effort to trick the search engine into thinking your site is popular. This used to work well, and even today some companies see immediate results, but it's not a lasting strategy. The Penguin algorithm eventually catches up, devalues those links and penalizes sites that use these tactics.
What Are Search Engine Algorithms?
by Kanika Kapoor 02:29:00 0 comments
Unique to every search engine, and just as important as keywords, search engine algorithms are the why and the how of search engine rankings. Basically, a search engine algorithm is a set of rules, or a unique formula, that the search engine uses to determine the significance of a web page, and each search engine has its own set of rules. These rules determine whether a web page is real or just spam, whether it has any significant data that people would be interested in, and many other features to rank and list results for every search query that is begun, to make an organized and informational search engine results page. The algorithms, as they are different for each search engine, are also closely guarded secrets, but there are certain things that all search engine algorithms have in common.
1. Relevancy – One of the first things a search engine algorithm checks for is the relevancy of the page. Whether it is just scanning for keywords, or looking at how these keywords are used, the algorithm will determine whether this web page has any relevancy at all for the particular keyword. Where the keywords are located is also an important factor to the relevancy of a website. Web pages that have the keywords in the title, as well as within the headline or the first few lines of the text will rank better for that keyword than websites that do not have these features. The frequency of the keywords also is important to relevancy. If the keywords appear frequently, but are not the result of keyword stuffing, the website will rank better.
2. Individual Factors – A second part of search engine algorithms are the individual factors that make that particular search engine different from every other search engine out there. Each search engine has unique algorithms, and the individual factors of these algorithms are why a search query turns up different results on Google than MSN or Yahoo!. One of the most common individual factors is the number of pages a search engine indexes. They may just have more pages indexed, or index them more frequently, but this can give different results for each search engine. Some search engines also penalize for spamming, while others do not.
3. Off-Page Factors – Another part of algorithms that is still individual to each search engine are off-page factors. Off-page factors are such things as click-through measurement and linking. The frequency of click-through rates and linking can be an indicator of how relevant a web page is to actual users and visitors, and this can cause an algorithm to rank the web page higher. Off-page factors are harder for web masters to craft, but can have an enormous effect on page rank depending on the search engine algorithm.
Search engine algorithms are the mystery behind search engines, sometimes even amusingly called the search engine’s “Secret Sauce”. Beyond the basic functions of a search engine, the relevancy of a web page, the off-page factors, and the unique factors of each search engine help make the algorithms of each engine an important part of the search engine optimization design.
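As a concrete (and deliberately simplified) illustration of the relevancy factors described above, here is a Python sketch that scores a page for a keyword based on where and how often it appears. The weights and thresholds are invented for illustration; no real search engine publishes its formula.

```python
# Toy relevancy score: rewards a keyword appearing in the title,
# early in the body, and at a reasonable frequency.
# All weights and thresholds below are invented for illustration.

def relevancy_score(keyword: str, title: str, body: str) -> float:
    kw = keyword.lower()
    words = body.lower().split()
    score = 0.0
    if kw in title.lower():
        score += 3.0                      # keyword in the title
    if kw in " ".join(words[:20]):
        score += 2.0                      # keyword in the first few lines
    freq = words.count(kw) / len(words) if words else 0.0
    if 0 < freq <= 0.05:
        score += 2.0 * (freq / 0.05)      # frequency bonus, capped to deter stuffing
    return score
```

Note how the frequency bonus cuts off entirely above 5% density: as the article says, frequent keywords help only when they are "not the result of keyword stuffing".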
Do-Follow Backlinks vs. No-Follow Backlinks
by Kanika Kapoor 22:10:00 0 comments
Many people don't know the real difference between rel="nofollow" and rel="dofollow" links.
A do-follow link is a normal hyperlink that Google and other search engine bots follow through to the linked site. If that is too brief to be clear, here is an illustration.
To rank in Google for a particular keyword, you need points, and the points come from the number of backlinks pointing to your site (inbound links). Google counts only the links that can be followed, that is, do-follow links. More do-follow links pointing to your site means more points, and more points means you rank better.
That's not to say that all do-follow links earn you the same points. Google weighs several factors when ranking a site, and one that earns a site more points is PageRank (PR). A do-follow link pointing to your site from a high-authority site (for example, a link from the BBC) is worth far more.
Example of Do-Follow Link
<a href="http://www.crunchdigital.in/">Crunch Digital</a>
It is not necessary to add rel="dofollow" to a link, because hyperlinks are dofollow by default.
The nofollow attribute tells search engine bots not to follow a link. That means if a site owner links back to you with the nofollow attribute, the link passes no link juice; only humans will be able to follow it. Google has made it clear that such links carry little to no ranking weight. Even so, it is good practice to use the nofollow attribute on links where you don't want to pass link juice. In short, a link that doesn't allow search engine bots to follow it is a no-follow link, and it is marked with the rel="nofollow" attribute.
Example of No-Follow Link
<a href="http://www.crunchdigital.in/" rel="nofollow">Crunch Digital</a>
According to Google Webmasters, both kinds of links are useful for SEO, but compared to nofollow, dofollow links carry more weight.
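As a practical illustration, the following Python sketch uses the standard library's html.parser to classify the links on a page as dofollow or nofollow. The sample HTML reuses the two example links above; everything else is a generic sketch, not any particular SEO tool.

```python
# Classify <a> links as dofollow or nofollow using Python's stdlib parser.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.dofollow = []
        self.nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel may hold several space-separated tokens, e.g. "nofollow noopener"
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.dofollow).append(href)

html = '''
<a href="http://www.crunchdigital.in/">Crunch Digital</a>
<a href="http://example.com/ad" rel="nofollow">Sponsor</a>
'''
audit = LinkAudit()
audit.feed(html)
```

After feeding the page, audit.dofollow holds the plain link and audit.nofollow holds the sponsored one, which is exactly the split a link-juice audit cares about.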