Reviewing the New Technique to Humanize a Search: Google Hummingbird

Pooja Balhara

Abstract


Whenever a user searches for something, a search engine returns results based on the keywords in the query. Various techniques can serve as the search algorithm that answers the user's query, and Google Hummingbird is one such technology. Earlier algorithms weighed all words in a query equally, whereas Hummingbird is smart enough to work out a user's true intent. Instead of asking what keywords users will be searching for, Hummingbird asks what kind of information users are looking for.

 

Keywords: search engine, search algorithm, SEO, crawler, SERP, knowledge graph.

 

I. Introduction

 

When a user searches for content on a search engine, the results displayed depend on each website's content. Organizations therefore now focus on the content of their websites so that, when a user searches for anything related to them, their website appears at the top of the result list. One process that helps achieve this is Search Engine Optimization.

 

Search engine optimization (SEO) is the process of affecting the visibility of a website or a web page in a search engine's "natural" or unpaid search results. In general, the more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search, news search and industry-specific vertical search engines.

 

 

SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by the targeted audience.

Optimizing a website may involve editing its content, HTML, and associated coding, both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.
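Such on-page work lends itself to simple automated checks. The following is a minimal sketch, not a method proposed in this paper, assuming the third-party requests and BeautifulSoup libraries; the URL and target keyword are hypothetical placeholders. It flags a few of the signals mentioned above: keyword relevance in the title and meta description, images whose missing alt text is invisible to an indexer, and the outbound links a crawler could follow.

```python
import requests
from bs4 import BeautifulSoup

def audit_page(url: str, keyword: str) -> dict:
    """Report a handful of basic on-page SEO signals for one page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "") if desc_tag else ""

    return {
        # Relevance signals: does the target keyword appear where engines look?
        "keyword_in_title": keyword.lower() in title.lower(),
        "keyword_in_description": keyword.lower() in description.lower(),
        # Indexing barriers: images without alt text carry no crawlable text.
        "images_missing_alt": sum(1 for img in soup.find_all("img")
                                  if not img.get("alt")),
        # Link profile: outbound links a crawler can follow from this page.
        "outbound_links": len(soup.find_all("a", href=True)),
    }

print(audit_page("https://example.com", "search engine optimization"))
```

A real audit would cover many more signals (headings, canonical tags, robots directives), but the overall shape stays the same: fetch the page, parse it, and report on the elements the engines index.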

 

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return the information found on the page to be indexed.

The process involves a search engine spider downloading a page and storing it on the search engine's own server. There, a second program, known as an indexer, extracts information about the page: the words it contains and where they are located, any weight given to specific words, and all the links the page contains, which are then placed into a scheduler for crawling at a later date.

Site owners soon recognized the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white-hat and black-hat SEO practitioners. Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB; meta tags provide a guide to each page's content.
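To make the spider/indexer/scheduler pipeline concrete, here is a deliberately simplified sketch, again assuming the requests and BeautifulSoup libraries and a hypothetical seed URL. Production crawlers add politeness delays, robots.txt handling, distributed storage, and ranking, none of which is shown here.

```python
from collections import defaultdict, deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

MAX_PAGES = 5                                   # keep the toy crawl bounded
index = defaultdict(list)                       # word -> [(url, position), ...]
scheduler = deque(["https://example.com"])      # frontier: URLs to crawl later
seen = set(scheduler)
crawled = 0

while scheduler and crawled < MAX_PAGES:
    url = scheduler.popleft()
    try:
        page = requests.get(url, timeout=10).text   # the "spider" downloads the page
    except requests.RequestException:
        continue                                    # skip unreachable pages
    crawled += 1

    soup = BeautifulSoup(page, "html.parser")
    # The "indexer" extracts the words the page contains and where they occur.
    for pos, word in enumerate(soup.get_text().lower().split()):
        index[word].append((url, pos))

    # Links found on the page are placed into the scheduler for a later crawl.
    for anchor in soup.find_all("a", href=True):
        target = urljoin(url, anchor["href"])
        if target.startswith("http") and target not in seen:
            seen.add(target)
            scheduler.append(target)

print(f"crawled {crawled} pages, indexed {len(index)} distinct words")
```

The deque plays the role of the scheduler described above: links extracted from each downloaded page are queued for a later crawl, while the inverted index records each word together with the page and position where it occurs.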

 

