What is the Two-Step Flow Theory? (UGC NET Question)

The concept of the two-step flow theory was introduced by sociologist Paul Lazarsfeld in 1948 in the book "The People's Choice: How the Voter Makes Up His Mind in a Presidential Campaign".

The theory involves two steps: information flows first from the mass media to certain individuals known as opinion leaders, and from them it flows on to the wider public.

The Objective of the Theory

The objective behind the study was to find out what effect the presidential election campaign had on people's voting decisions. The researchers wanted to know whether messages from the mass media had any effect on voters' choices. They found, however, that the effect was very limited. Based on this outcome, the two-step flow theory of mass communication was developed.

Important Factors in the Two-Step Flow Theory

  • Mass Media (Newspaper and Radio)
  • Opinion Leader
  • Audience or Public

Opinion Leader

An opinion leader is a leader for a particular group. He interprets the information received from the mass media and passes it on to his followers. The leader may add his own interpretation to the message received before forwarding it to the audience.

Note: An opinion leader is a leader only for a specific group of people who like and follow him, not for the entire population.

Example: During an election, a politician is an opinion leader for the groups of people who follow him, but not for those who do not follow or admire him.

    Criticism


• Many researchers found evidence that mass media information flows directly to the general population and does not always depend on opinion leaders.

• The flow of information does not necessarily involve only two steps; it can involve more.

• The two-step flow theory is considered outdated because it was formulated in an era when television and the internet did not exist and people relied on newspapers and radio, which were not accessible to everyone; opinion leaders were therefore the medium through which messages were communicated. Nowadays people have television and the internet, and they form their opinions largely on the basis of messages from the mass media and less on the basis of opinion leaders.

    Techniques of SEO


    White Hat SEO

When we follow Google's or other search engines' guidelines in order to rank a website in the SERP (Search Engine Results Page), it is known as White Hat SEO.

Examples: Meta description, keyword density, keyword in URL, unique content.

    Black Hat SEO

When we go against Google's guidelines and follow unethical practices in order to rank a website in Google Search Results, it is known as Black Hat SEO.

Examples: Link spamming, plagiarized or copied content.


    Gray Hat SEO

Gray Hat SEO refers to practices that are not explicitly defined in Google's or other search engines' guidelines. Google neither supports nor disapproves of these practices. As a result, they will neither boost nor decrease your website's rank in Google Search Results.

Examples: Google AdSense, Google AdWords (now Google Ads).

    Sitemap

What is a Sitemap?


A sitemap is basically a list of all the web pages of a website. It works like a map of your website, showing which pages are present on it. It is a file that helps Google and other search engines such as Bing, Yahoo, and Baidu crawl all the pages of your website in one go and index them easily.


Types of Sitemap

There are two types of sitemap:


• XML Sitemap


An XML sitemap is search-engine friendly. It is meant for the robots or crawlers of search engines so that they can crawl and index each page of the website easily.
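As an illustration, here is a minimal sketch of how such a sitemap could be generated with Python's standard library. The example.com URLs and the changefreq value are placeholders, not details taken from any real site.

import xml.etree.ElementTree as ET

# Placeholder list of pages; in practice this would be every page of your site.
pages = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/contact",
]

# <urlset> is the root element defined by the sitemaps.org protocol.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page               # page address
    ET.SubElement(url, "changefreq").text = "monthly"   # optional crawl hint

# Write the file that crawlers request at yoursite.com/sitemap.xml
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

The generated file is simply a plain XML list of <url> entries that search engine robots read.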


• HTML Sitemap


An HTML sitemap allows website visitors to navigate the website easily. It helps humans, i.e. website visitors, find any page of the website easily.

Note: Use only one sitemap format, XML or HTML.
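For comparison, here is a minimal sketch of an HTML sitemap: a plain page of links that human visitors can browse, again generated with Python. The page titles and URLs are placeholders.

# Placeholder page titles and URLs for illustration only.
pages = {
    "Home": "https://www.example.com/",
    "About": "https://www.example.com/about",
    "Contact": "https://www.example.com/contact",
}

links = "\n".join(
    f'    <li><a href="{url}">{title}</a></li>' for title, url in pages.items()
)

html = f"""<!DOCTYPE html>
<html>
<head><title>Sitemap</title></head>
<body>
  <h1>Sitemap</h1>
  <ul>
{links}
  </ul>
</body>
</html>"""

# Publish this as a normal page (for example /sitemap.html) and link to it
# from the footer so visitors can find it.
with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write(html)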

How to check whether a website contains a sitemap or not?

Type domainname/sitemap.xml (for example, www.example.com/sitemap.xml) in the URL bar to check whether a sitemap is present on the website.
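The same check can also be scripted. Below is a small sketch using Python's standard library; www.example.com is a placeholder for whichever site you want to test.

import urllib.error
import urllib.request

url = "https://www.example.com/sitemap.xml"   # placeholder domain
try:
    with urllib.request.urlopen(url, timeout=10) as response:
        # An HTTP 200 response means the sitemap file exists and was served.
        print("Sitemap found, HTTP status:", response.status)
except urllib.error.HTTPError as e:
    # A 404 here usually means no sitemap exists at this location.
    print("No sitemap at this URL, HTTP status:", e.code)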

How to generate XML and HTML sitemaps?

• Open an online sitemap generator website and enter the URL of the website whose sitemap you want to generate.
• The sitemap will be generated.
• Download the sitemap file.
• Send the file to the developer, who will upload it to the website.

How to Submit a Sitemap in Google Webmaster Tools / Search Console?

Note: Google Webmaster Tools and Google Search Console are one and the same thing.

• Open Google Webmaster Tools or Search Console.
• On the left side you will see the Sitemaps option; click on it.
• Enter sitemap.xml and submit it.
• Your sitemap will be submitted in Google Search Console.



                     


    Robots.txt

What is Robots.txt?

Robots.txt is a text file that tells robots (crawlers) which pages of a website to crawl and which pages not to crawl. For example, if your website has a page www.example.com/abc and you do not want this page to be crawled by any search engine, you can use a robots.txt directive to tell robots not to crawl this page of the website.
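As a rough sketch of what such a rule looks like and how crawlers read it, here is a minimal robots.txt parsed with Python's standard urllib.robotparser; the /abc path and the example.com domain are placeholders carried over from the example above.

from urllib.robotparser import RobotFileParser

# A minimal robots.txt: block every crawler from the /abc page.
robots_txt = """\
User-agent: *
Disallow: /abc
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# /abc is blocked for all crawlers; other pages remain crawlable.
print(rp.can_fetch("*", "https://www.example.com/abc"))   # False
print(rp.can_fetch("*", "https://www.example.com/blog"))  # True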

How to Create a Robots.txt File?

• Open an online robots.txt generator website and enter your website's URL.
• Your robots.txt file will be generated.
• Give this file to the developer, who will upload it to the website.

How to check whether your website has a robots.txt file or not?


Type websitename/robots.txt (for example, www.example.com/robots.txt) in the URL bar.

If a robots.txt file opens, it means your website has one.







Google Penguin Update

Launch Date: 24 April 2012

The objective of the Penguin update is to decrease the ranking of sites that are involved in link spamming, keyword stuffing, or other black-hat practices.

    Issues Addressed in Penguin Update

Link Spamming

It is a black-hat practice whose motive is to manipulate the PageRank or ranking of a site in Google's search results. It basically consists of posting irrelevant website links on blogs, forums, and other websites in order to increase the rank of a website in Google Search Results.

     Examples of Link Spamming

• Buying and selling of links that have good domain and page authority, i.e. exchanging money for good links.

• Excessive link exchanges, i.e. website A links to website B and website B links to website A.

• Making web pages that consist of keywords only, with little relevant content.

• Using automated programs or services to create links to your site.

Keyword Stuffing

Keyword stuffing is the practice of putting an excessive number of keywords into the web pages of a website in order to rank the website easily in Google Search Results.
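A rough way to spot keyword stuffing is to measure keyword density, i.e. what share of a page's words are the target keyword. The sketch below is a simplified illustration: the keyword_density helper, the sample text, and the 2-3% rule of thumb are assumptions for demonstration, not thresholds defined by Google.

def keyword_density(text: str, keyword: str) -> float:
    """Return occurrences of the keyword as a percentage of total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Placeholder page text that repeats its keyword far too often.
page_text = (
    "digital marketing course in delhi best digital marketing "
    "institute digital marketing training digital marketing"
)

# Roughly 2-3% is often treated as a safe density; values far above that
# suggest stuffing.
print(f"Keyword density: {keyword_density(page_text, 'marketing'):.1f}%")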

    How to Recover from Penalty

• Maintain a proper keyword density and prevent keyword stuffing in the content of web pages.

• Find bad links through tools like Dr. Link Check and remove bad links or 404 errors from the website.





Google Panda Update

Panda / Farmer Update

Release Date: 23 February 2011

The name Panda originated from the name of Google engineer Navneet Panda. The objective of this update was to lower the rank of websites with low-quality, thin, or poor content and to increase the rank of websites that have good-quality, relevant content in Google Search Results.

    Latest Panda Updates

    Panda 4.0

Release Date: May 19, 2014

It was announced as an important update on May 20, although the data suggests that it actually began rolling out on May 19. It focused on the overall quality of content. Sites such as ask.com and ebay.com were strongly affected. Overall, the update affected about 7.5% of search queries in English.

    Panda 4.1

Release Date: 23 September 2014

This update targeted affiliate sites with thin content or without useful information, sites with too many affiliate links, and search results with broken links or 404 errors. The estimated impact was 3% to 5% of search queries.

Issues Focused On in the Panda Update

    Poor Content 

Web pages that do not contain any relevant information useful for humans are called poor-quality content pages. These are pages with very little text and few relevant resources, such as a set of pages describing a variety of beauty tips with very little substance in each.

Content Farms

It is the practice of hiring a large number of freelance content writers to write a high volume of content with lots of keywords in order to rank a website easily on the SERP or in Google Search Results, even though the relevance of the content is very low.

        Plagiarized Content 


Plagiarized or duplicate content is content that is copied from other websites. Copied content can also exist within your own website when it has multiple pages with the same content. For example, your website has three pages for digital marketing targeting different locations such as Noida, Delhi, and Mumbai, and more than 80% of the content of these pages is copied or plagiarized. Google does not support copied content, as it is against their policies, and it decreases the rank of websites that have copied content.