Top SEO Interview Questions And Answers! - Digital Marketing Interview Questions And Answers For Basic and Advance Level

Wednesday, 28 September 2016




1) What is SEO?

SEO stands for Search Engine Optimization. It is a process/technique of optimizing your website or web pages to improve their ranking in the SERPs (Search Engine Result Pages).

2) How many types of SEO are there? Describe them.

SEO techniques are generally categorized into two parts:

On-Page SEO
Off-Page SEO
On-page SEO covers all measures taken within the website itself, which are fully controlled by the developer. It includes the title, meta keywords, meta description, heading tags, content optimization, etc. In simple words, any change made within the website, ranging from the navigation structure to the content placement strategy, is known as on-page optimization.

Off-page SEO covers activities done outside your website to improve its position in Google search results. Generally, all link-building activities come in this category. Below is a list of a few off-page methods.

Directory Submission
Social Bookmarking
Search Engine Submission
Profile Submission
Business Listing
Press Release
Article Submission
Social Media Submissions
3) Which is the most popular tool to track your website visitors?

Google Analytics - This tool is freemium (free + premium). It tracks and reports website traffic: how many visitors are online, which locations they come from, and how much time they spend on a particular page. You can check your website traffic in different segments, such as traffic from organic search, paid search, or direct visits. You can also monitor traffic from different devices - desktops, mobile phones, and tablets - along with search keywords and landing pages.

4) What is Organic Search and Organic Search Result?

A search made through a search engine's query box is known as an organic search, and the unpaid listings of pages that appear below the query box are called organic search results.

5) What is Black Hat SEO and White Hat SEO?

Black Hat SEO refers to techniques that improve website ranking by breaking the rules and policies of search engines. Hidden text, cloaking, and spamdexing are some well-known Black Hat SEO practices.

White Hat SEO refers to techniques that improve website ranking by following the rules and policies of search engines, such as quality content, relevant titles, meta tags, H1 tags, etc.

6) What are backlinks?

All links on other websites that point back to your website or webpage are called backlinks.

7) What is Googlebot?

Googlebot is the software (program) that automatically crawls, indexes, and caches new and old webpages across the internet and adds them to the Google index. It is also known as a spider or crawler.

8) What is robots.txt?

It is a special kind of text file containing instructions that tell crawlers which webpages, files, or directories they may crawl. We can also instruct crawlers not to crawl particular webpages, files, or directories.
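For illustration, a minimal robots.txt might look like the sketch below (the directory names are hypothetical examples):

```text
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Sitemap: http://example.com/sitemap.xml
```

Here `User-agent: *` applies the rules to all crawlers, and each Disallow line blocks one directory from being crawled.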

9) Where is the robots.txt file uploaded in the hosting directory?

A robots.txt file is uploaded to the home/root folder on the hosting server.

10) Can the webpage extension affect SEO?

No, extensions do not affect SEO. You can use .html, .htm, .asp, .aspx, .php, etc., depending on the technology used to create the website.

11) What is Custom 404 page?

When a broken link is found on the website (i.e., a page is not found), the server returns error 404, which means "page not found". To handle such broken-link errors gracefully, we direct the visitor to a dedicated page, known as a Custom 404 page.
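Assuming an Apache server (other servers configure this differently), a custom 404 page can be set in the .htaccess file; the file name below is a hypothetical example:

```apache
# Serve this page whenever the server returns a 404
ErrorDocument 404 /custom-404.html
```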

12) How should be the page title?

The page title is an important aspect of search engine optimization. It should be unique, accurate, and not more than 70 characters.

13) What is Meta Description and how it should be?

The Meta Description tag provides a summary of the page to search engines; it is the second most important element after the page title. The content of the Meta Description tag must be unique and must not match any other page's meta description. It must be related to the content available on the web page. Do not fill the description with keywords.

14) Where we should put keywords in the web page?

Keywords should be in the Title tag.
Keywords should be in the Meta Description tag.
Keywords should be in the H1, H2, and H3 (if possible) tags.
Keywords should be in the first paragraph of the web page.
Other paragraphs should also contain the keywords.
Keywords should be in the image alt attribute.
The image name should contain keywords.
Keywords should be in the page URL.
15) What is Google SandBox?

Google Sandbox puts websites on a probationary period until they prove that they can be trusted. The Google Sandbox was introduced to stop new websites from gaining undeserved rankings through tricks such as buying old domains or creating subdomains on already-ranked domains.

16) What is WWW and NON WWW?

Search engines may view sample-site.com and www.sample-site.com as two different website URLs. If link popularity gets split between the www and non-www versions, links will have less authority on each version than if the two were combined.

To prevent both versions (www and non-www) from getting indexed in the search engine database, you have to set up a 301 redirect from one version to the other.
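On an Apache server (an assumption; the same can be achieved in Nginx or at the host level), the non-www version can be 301-redirected to the www version via .htaccess:

```apache
# Redirect non-www to www with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^sample-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.sample-site.com/$1 [R=301,L]
```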

17) How should be the paragraph content?

Content should be easy to read.
Content should be related to the topic - it should not be a large block of text without paragraphs and subheadings.
Content should be fresh and unique, not copied from other sources.
Content should be targeted at users, not at search engines.
18) How will you optimize Anchor Tag?

Both users and search engines rely on anchor text, so it should be meaningful and descriptive.
Anchor text should be related to the linked page.
Avoid using long, paragraph-like text as anchor text.
Create more internal links between web pages, with meaningful, descriptive text pointing to related target pages.
19) How will you optimize Image Tag?

The image name should contain keywords.
The image alt attribute should contain long-tail keywords.
The image file name should not be lengthy.
20) How many Heading - H1 tag should be in the Web Page?

There should be only one H1 tag, and it should be unique, fresh, and relevant to the topic.

21) What is robots.txt and how it is useful for SEO?

robots.txt enables you to restrict crawling where it is not needed. This file holds a set of instructions that tell search engines which parts of the website they may or may not crawl.

22) What do webmaster tools do?

The webmaster tools do the following things:

Analyze the robots.txt file.
Analyze pages of the website that fail to open or are not optimized.
Submit URLs to be crawled by the search engine.
Remove already-crawled URLs from the search engine database.
Specify the preferred domain.
Analyze issues with title, meta description, image alt, and anchor tags.
Display crawling and indexing errors.
Identify whether or not the website is mobile friendly.
Submit and analyze the sitemap file.
Display the total number of submitted and indexed pages in the search engine database.
23) What are some popular Search Engines?

Google, Bing, Ask, Yahoo!, DuckDuckGo

24) What is keyword stuffing?

Littering a page with keywords that are not relevant to it, or using the same keywords on multiple pages, is known as keyword stuffing. To fix this issue, use unique, meaningful, and descriptive keywords for each page.

25) What is Search Engine Submission?

Webmasters and website owners submit their website and web pages to the search engines, along with keywords. Soon after the website or web page URL is submitted, a search engine bot crawls the submitted URL and adds it to the search engine's index.

26) What is Meta Robots tag and how it works?

A Meta Robots tag gives page-level instructions to search engine crawlers about whether to index the page and follow its links. The tag should be included in the HEAD tag.

Syntax:
<meta name="robots" content="NOINDEX, NOFOLLOW">
<meta name="robots" content="INDEX, FOLLOW">
The content value NOINDEX, NOFOLLOW instructs the crawler not to index the web page or follow its links, and the value INDEX, FOLLOW instructs the crawler to index the web page and follow its links.

27) What is Disallow in the robots.txt file?

It is an instruction to search engines that prevents (restricts) access to specific pages or directories.

28) What is MozRank (mR)?

MozRank (mR) shows the popularity of a given web page or website; web pages with a high MozRank can achieve good rankings in the search engines. A MozRank can be improved by getting more and more backlinks from moderately and highly popular websites or web pages.

29) What is keyword research?

Keyword research is the technique of researching high-paying and popular keywords on the web. We do keyword research to learn a keyword's monthly search volume and popularity so that we can rank web pages for those keywords.

30) What are the free and popular keyword research tools?

Google AdWords Keyword Planner, Microsoft Bing Ads Intelligence, and Wordtracker's Free Basic Keyword Demand.




1) What is Bounce Rate? What are the factors for high bounce rate? How to reduce bounce rate?

Bounce rate is the percentage of visitors who leave your website after viewing only a single page. Let's understand bounce rate in detail with the formula:

Rb = (TV / TE) × 100

Where,
Rb : Bounce Rate (as a percentage)
TV : Total number of visits viewing one page only
TE : Total entries to that page
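The bounce rate calculation can be sketched in a few lines of Python (the function name and sample numbers are illustrative):

```python
def bounce_rate(single_page_visits, total_entries):
    """Bounce rate as a percentage: (TV / TE) * 100."""
    if total_entries == 0:
        return 0.0  # no entries means no bounces to measure
    return (single_page_visits / total_entries) * 100

# 40 one-page visits out of 160 total entries -> 25.0% bounce rate
print(bounce_rate(40, 160))
```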

Many factors are responsible for a high bounce rate. A user may leave a particular page because of site design or usability issues. A second situation is when the user gets all the needed information on the landing page itself.

Techniques to reduce your bounce rate:
Make your website navigation user friendly so users can find the required information easily.
Target the right landing page for each relevant search term.
Optimize those pages for all related search terms.
Redesign the entrance pages that have a high bounce rate.

2) What is the difference between an HTML and an XML sitemap?

An HTML sitemap is generally created for a better user experience, so that users can easily navigate to all other internal pages from a single page. This sitemap also gives information about the website structure.

An XML sitemap is created for search engines, so that Googlebot can easily crawl, index, and discover new pages. An XML sitemap also gives various options: you can add additional information about pages within the sitemap, such as images, videos, and news, and you can set a change frequency for each page with options such as daily, weekly, or monthly.
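A minimal XML sitemap might look like the sketch below (the URL, date, and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/page1.html</loc>
    <lastmod>2016-09-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```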

3) What is the difference between Bounce Rate and Exit Rate?

Bounce rate is the percentage of visitors who land on your website and leave without visiting any other page, while exit rate is the percentage of visitors who leave your website from a given page. Exit rate is calculated for each individual page.

Exit Rate = (number of exits from the page / total page views of the page) × 100

4) What are 301 and 302 redirects? When should I use these redirection methods?

A 301 redirect, also called a permanent redirect, informs crawlers that your page content has moved permanently to another page. A 301 redirect passes the link juice completely to the new page. Permanent redirection is used when a domain is moved to a new CMS or when the page URL structure is changed.

A 302 is commonly referred to as a temporary redirect. This redirection method does not pass the link juice to the new page. It informs the search engine crawler that your content is just offline temporarily. Temporary redirection is widely used in the e-commerce industry for category pages whose products are sold out or out of stock.
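Assuming an Apache server, both redirect types can be set in the .htaccess file; the paths below are hypothetical examples:

```apache
# Permanent (301) redirect - passes link juice to the new URL
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Temporary (302) redirect - e.g. for a product that is out of stock
Redirect 302 /product.html http://www.example.com/category.html
```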

5) How to search best keyword for particular business website?

For keyword research, Google Keyword Planner is the best option; you can find it after logging into your Google AdWords account. The keyword ideas option will give you relevant search terms with their corresponding search volumes, and you can also filter keywords and search volumes by city or country. Another way to research keywords is through the Google suggestions shown below the list of search results. You can also use third-party tools such as Wordtracker, WordStream, and Google Trends.

6) What is Google Sandbox?

Google Sandbox is basically an imaginary place where Google keeps new websites before giving them high ranking positions in the SERPs. The reason to put a new website in the sandbox is to check whether the website is genuinely well optimized or is using black-hat techniques to rank well in the search results. A domain can stay in the sandbox for anywhere from less than one month to eight months.

7) What is canonical Issue? How to resolve this issue?

A canonical issue arises when the same page can be accessed from multiple URLs. This issue can lead to duplicate content within the same website. For example, in the e-commerce industry, the same page can be accessed through various filters such as color, size, and price. To resolve this issue, a canonical tag with rel="canonical" is placed on each alternate version of the page, pointing to the original.

For Example:
<link rel="canonical" href="http://example.com/sports-shoes"/>
8) How to target any specific country audience for your business?

To target the people of a particular country, you should first have a domain with that country's TLD (top-level domain). For example, to target a country such as Canada, you should have a domain like http://www.example.ca. Your website's IP should also be from the same region, so host the website within that country. The next important step is the geographic setting in webmaster tools: go to the Search Traffic section of your webmaster tools account, choose International Targeting, and under the Country section select the audience country you want to target.

9) What is the maximum limit of URLs we can add into a single sitemap? What is Sitemap index file? When we need to create this file?

The maximum limit of URLs in a sitemap is 50,000, and the file should not be larger than 10 MB. If your sitemap contains more than 50,000 URLs, split it into multiple sitemaps. A sitemap index file contains the list of all your sitemaps; it can list up to 50,000 sitemaps and must also not exceed 10 MB. To manage multiple sitemaps, a sitemap index file is created with the following structure (the URL and date are placeholders):

            <?xml version="1.0" encoding="UTF-8"?>
            <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
              <sitemap>
                <loc>http://example.com/sitemap1.xml</loc>
                <lastmod>2016-09-01</lastmod>
              </sitemap>
            </sitemapindex>
10) What are keyword density, prominence and proximity?

Keyword Density: Keyword density refers to the percentage of times a keyword appears relative to the total number of words on a webpage.

Keyword Prominence: Keyword prominence refers to how prominently a keyword is placed within a web page. It is generally recommended to place important keywords in prominent positions, such as within the first paragraph of the page.

Keyword Proximity: It refers to the closeness between two or more keywords. For better keyword proximity, decrease the distance between important keywords.
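Keyword density as defined above can be computed with a short Python sketch (single-word keywords only; phrase density would need phrase matching):

```python
def keyword_density(text, keyword):
    """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return (words.count(keyword.lower()) / len(words)) * 100

# "seo" appears 2 times among 8 words -> 25.0%
print(keyword_density("SEO tips and tricks for better seo ranking", "seo"))
```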

11) What is the latest recommendation of Google for page title?

A page title should not exceed 512 pixels, because Google calculates the pixel width of the characters used in the title of the web page. Google truncates the title with CSS and adds an ellipsis once it crosses the limit.

12) What is schema.org? How it helps you improve your ranking?

Schema.org is a vocabulary of tags, where each tag has a specific meaning. You can add these tags to your web pages to change how they are represented in the SERPs; for example, you can add star ratings and reviews to your page's snippet in the results.

13) Can we use rel=canonical tag for cross domain?

Yes, you can use the rel=canonical tag across domains as well. Google announced support for the cross-domain canonical tag in 2009. This type of canonical tag is generally used for syndicated content.

14) How to remove a page from Google Indexing?

To remove a web page from Google's index, you can use the Remove URL option in webmaster tools. Before submitting the page in webmaster tools, first remove it from your website so that it returns a 404 or 410 status.

15) How to prevent crawler from indexing a webpage?

You can request that the crawler not index your web page in two ways. First, you can block that particular page in the robots.txt file. Second, use the NOINDEX value in a Meta Robots tag on the page:

<META NAME="ROBOTS" CONTENT="NOINDEX">

16) What is EMD Update of Google?

EMD stands for Exact Match Domain. It is a Google filter that prevents poor-quality sites from ranking well merely because they use commercial keywords in their domain name.

17) How to increase the page load speed?

To improve the page load time, we should follow these steps:
     1. Use external CSS files for web pages.
     2. Avoid large images on your web pages; compress images before using them.
     3. Remove irrelevant code from your pages.

18) List some tools that you use for your daily SEO activity?

I use following tools for my SEO work:
    1. Google Analytics
    2. Google Webmaster Tools
    3. Open Site Explorer
    4. Alexa
    5. Ahrefs
    6. Screaming Frog
    7.  Xenu

19) What do you understand by Google Dance?

Whenever Google rolls out a major update, the sequence of pages in the search engine results changes frequently for some days; hence it is called the Google Dance.

20) Is there any limit for robots.txt file?

Yes, Googlebot reads only the first 500 KB of a robots.txt file.
