As an Internet marketing strategy, SEO considers how search engines work, the programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web.
Website owners recognized the value of high ranking and visibility in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page’s content. By relying so heavily on factors such as keyword density, which were exclusively within a webmaster’s control, early search engines suffered from abuse and ranking manipulation. Companies that employ overly aggressive techniques can get their client websites banned from the search results.
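Keyword density, one of the easily manipulated signals mentioned above, is simply the share of a page’s words that match a given keyword. A minimal sketch of the idea follows; the tokenization rule is a simplifying assumption for illustration, not any engine’s actual code.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` equal to `keyword` (case-insensitive)."""
    # Crude tokenizer: lowercase words made of letters, digits, apostrophes.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A keyword-stuffed page scores high on this signal, which is exactly
# why early engines that trusted it were easy to manipulate.
density = keyword_density("cheap widgets cheap widgets buy cheap widgets", "cheap")
# 3 of the 7 words are "cheap"
```

Because the page author controls every term in the numerator and denominator, this signal alone says nothing about quality, which is why later engines moved to off-page factors such as links.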
In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization. In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.
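The link-based scoring behind Backrub is usually described as PageRank: a page’s prominence depends on how many pages link to it and how prominent those linking pages are. The sketch below is a simplified power-iteration version for illustration only; the damping factor 0.85 is the commonly cited value, and the graph and function name are hypothetical, not Google’s actual implementation.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        # Each page keeps a baseline share, then receives rank from inlinks.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for t in pages:
                    new_rank[t] += damping * rank[page] / n
        rank = new_rank
    return rank

# Toy web: "c" is linked by both "a" and "b", so it ends up most prominent.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

The key property, and the reason link-based ranking resisted early on-page manipulation, is that a webmaster cannot unilaterally raise a page’s score: the score flows in from other sites’ links.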
In June 2007, The New York Times’ Saul Hansell stated that Google ranks sites using more than 200 different signals. In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique. Search engines use complex mathematical algorithms to guess which websites a user seeks.