What is hidden text in SEO

Read Seo Blog, Seo onpage optimization tutorial blog

Hidden text

Hidden text is also known as “invisible text” or “fake text.” It is often used to spam search engines, but smart search engines such as Google can detect the use of hidden text.

  • Hidden text is text on a Web page which is visible to search engine spiders but not visible to human visitors.
  • Hiding text in your content to manipulate Google’s search rankings can be seen as deceptive and is a violation of Google’s Webmaster Guidelines. When evaluating your site for hidden text or links, look for anything that is not easily viewable by visitors: is any text or link there solely for search engines rather than for visitors? However, not all hidden text is considered deceptive.

For example, if your site includes technologies that search engines have difficulty accessing, like JavaScript, images, or Flash files, using descriptive text for these items can improve the accessibility of your site. Remember that many human visitors using screen readers, mobile browsers, browsers without plug-ins, and slow connections will not be able to view that content either and will benefit from the descriptive text as well. You can test your site’s accessibility by turning off JavaScript, Flash, and images in your browser, or by using a text-only browser such as Lynx.
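As an illustration, the difference between deceptive hidden text and legitimate descriptive text might look like the sketch below (file names and wording are made up for the example):

```html
<!-- Deceptive: keyword-stuffed text hidden from visitors; this is the kind
     of hidden text that violates Google's Webmaster Guidelines -->
<div style="display:none">cheap shoes buy shoes best shoes shoe store</div>

<!-- Legitimate: descriptive alternatives for content search engines and
     some visitors cannot access -->
<img src="traffic-chart.png" alt="Bar chart of monthly organic traffic">
<noscript>This page shows a live traffic chart; enable JavaScript to view it.</noscript>
```

The first block exists only for search engines; the second describes real content and helps screen-reader users and text-only browsers as well.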

How to Use Heading Tags for SEO


What are heading tags in SEO and how to use them

Heading tags, as their name suggests, are used to differentiate the heading of a page from the rest of the content. These tags are also known to webmasters as HTML header tags, head tags, heading tags and SEO header tags. The most important heading tag is the h1 tag and the least important is the h6 tag. In HTML coding, the header tags from h1 to h6 form a hierarchy. This means that if you skip any of the tag levels (i.e. jump from h1 to h3) the heading structure will be broken, which is not ideal for on-page SEO.

For example, if your site is introduced with a heading in h1 followed directly by a sub-heading in h3, the hierarchy is broken, meaning the heading structure is not SEO-friendly. The markup should keep the h1 to h6 order intact.
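A correct hierarchy might look like the sketch below (the heading texts are placeholders):

```html
<h1>On-Page SEO Guide</h1>

  <h2>Heading Tags</h2>
    <h3>Why Hierarchy Matters</h3>

  <h2>Image Alt Text</h2>
    <h3>Writing Descriptive Alt Text</h3>

<!-- Jumping straight from <h1> to <h3>, with no <h2> in between,
     would break this hierarchy -->
```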

The h1 tag is the most important tag. Every page must have an h1 tag.

Advantages of Using Heading Tags:

The heading tag is used to represent different sections of web page content. It has an impact on both the SEO and usability of your site.

Header tags from an SEO point of view:

  • Relevancy: Search engine spiders check the relevancy of the header tag with the content associated with it.
  • Keyword Consistency: The search engine spiders check the keyword consistency between the header tags and other parts of the page.
  • The Importance of an h1 Tag: The h1 is the most important tag and it should never be skipped on a page. Search spiders pay attention to the words used in the h1 tag, as it should contain a basic description of the page content, just as the page <title> does.
  • Enriched User Experience: Heading tags give the user a clear idea of what the page content is about. Search engines give much importance to user-experience on a site, meaning the presence of heading tags becomes an important component of SEO.

Header tags from a usability point of view:

  • For users of the web who must use a screen reader, it is easier to navigate sections of content by referring to properly structured headings on a page.
  • The h1 heading tag (main heading) of a page gives users a quick overview of the content that is to follow on the page.
  • By reading the different heading tags, users can scan a page and read only the section they are interested in.
  • Heading tags should be used for structure and SEO, not merely to gain larger, more prominent fonts; that said, the presentation of a web page does look cleaner with the presence of these tags.

Things you should not be doing with heading tags:

  • Do not stuff your heading tags with keywords.
  • Do not use more than one h1 tag on a page unless really necessary. Pages usually have a single h1 heading, and including two might make search engines see it as an attempt to stuff more keywords into multiple h1 tags. It is better to divide the content into two separate topics on individual pages, each with its own h1 tag. This makes more sense to both readers and search engine spiders; however, using multiple h1 tags is technically allowed.
  • Do not use heading tags as hidden text. Any hidden text can result in penalties for your site, especially if the hidden part is a component that affects SEO.
  • Do not repeat heading tags on different pages of your site. It is a good practice to have unique heading tags throughout your site.
  • Do not use the same content in your page’s h1 tag as in your meta title tag.

Do not use heading tags for styling text; use them for presenting organized and structured content on pages, and use CSS stylesheets for styling.
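For instance, instead of misusing a heading tag just to make text larger, a CSS class achieves the same look without breaking the heading structure (the class name here is illustrative):

```html
<style>
  .standout { font-size: 1.4em; font-weight: bold; }
</style>

<!-- Wrong: a heading tag used purely for its font size -->
<h3>Call us today!</h3>

<!-- Right: the same visual effect with CSS, no heading semantics -->
<p class="standout">Call us today!</p>
```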

What is image alt tag optimization


An image alt tag checker tool helps you count the images on a web page and reports which images are missing alt text. An alt tag is needed for every image on your web page; it should act as the image’s title and be related to the image. Google also suggests putting alt text on images, because alt tags help Google serve better image results to users. Using title attributes on images and anchor tags is also appreciated by search engines and informative for users.

Tips on putting alt tags for images

  • Write alt text that is related to the image, not just to fulfil the alt tag criteria.
  • It is hard to add alt tags after a website has been developed, so start adding them while developing the site.
  • If you have missed any, check with an alt tag checker tool and add alt text for all images on your web page.

If you look up the img tag (properly called the img element) in the HTML specification, you will see that alt is listed as an attribute.
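A minimal sketch of the alt attribute on an img element (the file name and text are placeholders):

```html
<!-- Descriptive alt text, related to the image content -->
<img src="red-running-shoes.jpg" alt="Pair of red running shoes on a white background">

<!-- An empty alt attribute is only appropriate for purely decorative images -->
<img src="divider-ornament.png" alt="">
```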

 

 

What is website canonical issue in seo and how to resolve it


A canonical issue arises when 301 redirects are not properly in place. Your website can then be accessed by search engines at several different URLs, and search engines can potentially index your site under each of them, making it look like a site of duplicated content.

For example, if you have the website http://www.abc.com, then the index page can be accessed from all of the following URLs:

http://www.abc.com

http://www.abc.com/index.html

http://abc.com

http://abc.com/index.html


What can be done to resolve the canonical issue?

The best and most effective way to resolve a canonical issue is with a permanent 301 redirect. This can be implemented in a number of ways, as detailed below; the server your website is hosted on determines which method you use.

In addition, it is worth logging into Google Webmaster Tools and setting up two profiles for your domain: one with the www. prefix and one without. Then go to “Site Configuration > Settings > Preferred Domain” and choose which domain you would like Google to use.
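Besides 301 redirects, Google also supports a rel="canonical" link element in the page head to indicate the preferred version of a URL; a minimal sketch, using the example domain from this section:

```html
<head>
  <!-- Tells search engines which URL is the preferred (canonical) version,
       even if the page is reachable at several addresses -->
  <link rel="canonical" href="http://www.abc.com/">
</head>
```

This is a hint rather than a redirect: visitors still see whichever URL they typed, but search engines consolidate indexing signals onto the canonical one.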

How to implement a 301 redirect with a .htaccess file

If you have your website hosted on any of the below server types then you will be able to use a .htaccess file:

  • Linux
  • Apache
  • Zeus
  • Sun Java

These are the most common hosting servers and also the easiest on which to implement a permanent 301 redirect. Simply copy the code into your existing .htaccess file if you have one, or open a blank Notepad document and save it as .htaccess

Options +FollowSymLinks

RewriteEngine on

# Redirect the non-www domain to the www version
RewriteCond %{HTTP_HOST} ^abc\.com [NC]
RewriteRule ^(.*)$ http://www.abc.com/$1 [L,R=301]

# Collapse accidental double slashes in the path
RewriteCond %{REQUEST_URI} ^(.*)//(.*)$
RewriteRule . http://www.abc.com%1/%2 [R=301,L]

# Redirect /index.html requests to the directory root
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(([^/]+/)*)index\.html\ HTTP/
RewriteRule index\.html$ http://www.abc.com/%1 [R=301,L]

You will need to change the abc.com domain name in each rule to your own domain name, and you may also need to change the index.html references. Depending on your site, you may have a .php index page or it may not be called index at all; check this on your website and change the rules accordingly.

 

Once the code has been edited and copied into the .htaccess file, save it and upload it to the root of the domain (the same location as the index page). These rules will redirect anyone who accesses the site to a URL which includes the www. prefix, and from /index.html to the root domain.

 

Setting up a 301 redirect using Windows server

If you host your website on a Windows server, you will need administrative access to the hosting server and will need to set up the 301 redirect through IIS.

  • Go to “All Programs > Administrative Tools > Internet Information Services”
  • Navigate to the domain, right-click on it, then select “Properties”
  • Click on the “Home Directory” tab
  • Select the radio button “A redirection to a URL”
  • Enter the URL you want to redirect to (e.g. http://www.abc.com)
  • Click “OK”

This will redirect the domain.

 

Word of warning!

Double check that the domain names are correct when implementing a permanent 301 redirect, and then double check them again. Once implemented, test that the redirect is working properly, and make sure you refresh the page several times so you are not viewing a cached page. This is vital because using a .htaccess file or IIS incorrectly could result in your website being brought down.

What is a robots.txt file


  • Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting up a robots.txt file is like putting a note saying “Please do not enter” on an unlocked door. You cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why, if you have really sensitive data, it is too naive to rely on robots.txt to protect it from being indexed and displayed in search results.
  • Impact on any website

A robots.txt file is a set of instructions for search engines, listing URLs or files that should not be crawled or indexed. When site owners wish to give instructions to web robots, they place a text file called robots.txt in the root of the website hierarchy (e.g. https://www.example.com/robots.txt). This text file contains the instructions in a specific format. Robots that choose to follow the instructions try to fetch this file and read the instructions before fetching any other file from the site. If this file doesn’t exist, web robots assume that the owner wishes to provide no specific instructions, and crawl the entire site. A robots.txt file functions as a request that specified robots ignore specified files or directories when crawling a site. This might be, for example, out of a preference for privacy from search engine results, a belief that the content of the selected directories might be misleading or irrelevant to the categorization of the site as a whole, or a desire that an application only operate on certain data.

Structure of a Robots.txt File

The structure of a robots.txt file is pretty simple (and barely flexible): it is a list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:

Disallow:

“User-agent:” names the search engine crawler a rule applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines by putting the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.

User-agent: *

Disallow: /temp/
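A slightly fuller robots.txt sketch, combining per-crawler rules with a sitemap reference (the crawler name and paths are illustrative; the Sitemap directive is widely supported by major search engines):

```
# Block one specific crawler from the whole site
User-agent: BadBot
Disallow: /

# All other crawlers: keep out of /temp and /admin
User-agent: *
Disallow: /temp/
Disallow: /admin/

# Point crawlers at the XML sitemap
Sitemap: http://www.abc.com/sitemap.xml
```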

How to check website duplicate content


Copyscape is a free plagiarism checker. The software lets you detect duplicate content and check whether your articles are original.

Impact on any website:  

  • Duplicate content is a huge topic in the search engine optimization (SEO) space, but should we worry about it? Google’s Panda update was meant to stop sites with poor-quality content from working their way into Google’s top search results. Panda is updated from time to time; when this happens, sites previously hit may escape if they have made the right changes, and Panda may also catch sites that escaped before. The Panda algorithm, designed to boost great-quality content sites while pushing down thin or low-quality content sites in the search results, has always targeted scraper sites and low-quality content sites in order to provide searchers with the best search results possible.
  • Duplicate content tool: http://copyscape.com/ (Copyscape Plagiarism Checker)

What is an HTML sitemap in SEO


HTML Sitemap: An HTML sitemap allows site visitors to easily navigate a website. It is a bulleted outline text version of the site navigation, and the anchor text displayed in the outline is linked to the page it references. Site visitors can go to the sitemap to locate a topic they are unable to find by searching the site or navigating through the site menus.

  • Impact on any website

An HTML site map is a single HTML page that contains links to all the pages of your website. Normally, this is accessible via a link in your site footer, where it will be displayed on every page. Sitemaps make navigating your site easier and having an updated sitemap on your site is good both for your users and for search engines. It will help users better understand site content and improve how search-engine crawl bots find deep site pages.
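A minimal HTML sitemap page might be just a nested list of links (the page names and paths are placeholders):

```html
<h1>Sitemap</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/seo/">SEO</a></li>
      <li><a href="/services/ppc/">PPC</a></li>
    </ul>
  </li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```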

What is an XML sitemap in SEO


The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A sitemap is an XML file that lists the URLs for a site in a structured format.

  • Impact on any website: From an SEO perspective, an XML sitemap guides the search engine’s robot (or spider) as it crawls and indexes your site. It is a structured file that users do not need to see, but it tells the search engine about the pages in a site, their relative importance to each other, and how often they are updated. This helps search engine bots find pages on the site. If one is not present on the website, create it and upload it to the root of the server.

Why is it important? Sitemaps allow search engines to find all of your webpages, which they might otherwise miss when indexing. The XML sitemap also allows you to specify additional information about each URL, such as when it was last modified (lastmod), how frequently it changes (changefreq), and its priority relative to other pages (priority).

How to generate an XML sitemap? Generating an XML sitemap for your website is a simple process, and there are many websites that can help you do so. Google recommends using http://xml-sitemaps.com

           http://www.seorankexpert.in/sitemap.xml
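A minimal XML sitemap, following the sitemaps.org protocol, looks like this (the URLs, dates, and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.abc.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.abc.com/about.html</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Only the loc element is required for each URL; lastmod, changefreq, and priority are optional hints to crawlers.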


What are meta tags in SEO


Meta Tags: A meta tag is a tag (that is, a coding statement) in the Hypertext Markup Language (HTML). Meta tags are specialized tags that contain data about the contents of a website or web page. They are placed near the top of the HTML in a web page, as part of the head.

  • Impact on any website: Meta tags are a great way for webmasters to provide search engines with information about their sites. Meta tags can be used to provide information to all sorts of clients, and each system processes only the meta tags they understand and ignores the rest.

 

  • Meta tags include:
    • Meta Title
    • Meta Description
    • Meta Keywords

Meta title optimal length for search engines: Google typically displays the first 50-60 characters of a title tag.

For example:
<head>
<title>Best digital marketing agency Delhi</title>
</head>

  • Meta description optimal length for search engines: The meta description summarizes the web page’s content; the standard length is roughly 150 to 160 characters.

For example:
<head>
<meta name="description" content="Here is a description of the applicable page">
</head>

  • Meta Keywords: Meta keywords are a specific type of meta tag that appears in the HTML code of a web page and helps tell search engines what the topic of the page is.

The tag defines the page’s keywords; use around 5 to 10 keywords.

For example: <meta name="keywords" content="digital marketing agency, online promotion company, seo consultant">