Papers by Zuze, H
The crossover point between keyword rich website text and spamdexing. Herbert Zuze, Cape Peninsula University of Technology. CPUT Digital Knowledge: Theses & Dissertations, 16 September 2011.
""The quality of webpage content has an effect on how search engine crawlers index it. ... more ""The quality of webpage content has an effect on how search engine crawlers index it. Similarly, the degree of interlinking between websites with a similar focus plays a large role in how search engines perceive them. As a result, link building has become a popular technique in search engine optimisation as it provides a positive impetus to the value search engines attach to a website. Link wheels are identified as one method of significantly improving this ranking of commercial websites. However, some search engine optimisation practitioners maintain that link wheels should not be implemented, since the effort required to create them does not justify the rewards. Five similar test websites will be developed as the basis of the research. Two of them will be based on white hat and another two on black hat techniques. These websites will use either a link wheel or other industry related link building strategies or methodologies other than link wheel. The fifth website will be implemented as a control mechanism without link building or a link wheel structure, and its design will be based on white hat techniques. In parallel with this experimentation, interviews will be conducted to determine how practitioners incorporate link wheels as a link building strategy. It is expected that the link wheel-based website will yield better rankings and a fair comparison in terms of return on investment. The results will be relevant to and usable by academics, industry specialists and online marketers - to enhance the quality and quantity of commercial website links, hence achieving a higher search engine ranking. Go to www.web-visibility.co.za for a copy of the image of this poster. Click on "Website Visibility Publication Library", register as a user, etc.""
A user can also submit a website manually for indexing. Crawlers visit a website, record all the words on the pages, and note links to other sites. They track links from one page to another and index everything they come across on their way. Nevertheless, they do not see images, ...
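A minimal sketch of the crawl-record-follow loop described above, assuming the third-party requests and beautifulsoup4 packages; the page limit and breadth-first order are illustrative choices, not how any particular search engine works.

```python
# Minimal crawl-record-follow loop: visit pages, record the words on
# each page, note links to other pages and follow them in turn.
# Requires: pip install requests beautifulsoup4
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(seed_url, max_pages=10):
    index = {}                              # word -> set of URLs containing it
    queue, seen = deque([seed_url]), {seed_url}
    visited = 0
    while queue and visited < max_pages:
        url = queue.popleft()
        try:
            page = requests.get(url, timeout=5)
        except requests.RequestException:
            continue                        # unreachable page: skip it
        visited += 1
        soup = BeautifulSoup(page.text, "html.parser")
        # Record every word in the rendered text (images are not seen).
        for word in soup.get_text().lower().split():
            index.setdefault(word, set()).add(url)
        # Note links to other pages for later visits.
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            if urlparse(link).scheme in ("http", "https") and link not in seen:
                seen.add(link)
                queue.append(link)
    return index
```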
"Zuze, H. & Weideman, M. 2011. Cloaking on Google’s SERP – search engine spamdexing? Pos... more "Zuze, H. & Weideman, M. 2011. Cloaking on Google’s SERP – search engine spamdexing? Poster presentation in: Proceedings of The Thirteenth World Wide Web conference, Johannesburg, SA. www.zaw3.co.za. 13-16 September. Whilst the Internet is growing exponentially and acting as the repository of an enormous amount of information, the content quality is also being compromised. Content relevancy is continually being decreased by an increasing number of unethical website designers who are implementing various black hat techniques to trick the search engine algorithm in order to obtain a higher ranking. Research was conducted with the aim to establish the how and when of search engine algorithms’ attempts to curb spamdexing. Nevertheless, cloaking, another form of spamdexing, was identified as existing on the search engine result page (SERP) of Google, regardless of the practice having been denounced by the search engine. Five similar websites with varying keyword densities were designed and submitted to Google, Yahoo! and Bing. After 16 days of the first experiment, three of these websites were cloaked by an Iranian site. The cloaking lasted for 10 days for the first, 11 days for the second and 39 days for the third website. A phase 2 experiment was conducted and the third website was not indexed by Google for both phases of the experiments. This was the case in spite of the fact that it had a high keyword density, as supported by scholars. Also, the Iranian webmasters might have opted to scrap the site. This could be done in expectation of a higher number of visitors to the site looking for laptops rather than the actual information that was included on their site. None of the websites submitted to Yahoo! and Bing were cloaked. This research provided evidence that the search engine algorithms are still failing to fully address these practices and some developers are implementing cloaking without being identified by the search engines. The research also established that the waiting time for indexing can be prolonged by such practises and this may result in some websites not being indexed at all. Go to www.web-visibility.co.za for a copy of the full text of this poster. Click on "Digital Library", register as a user, etc."
Thesis Chapters by Zuze, H
With over a billion Internet users surfing the Web daily in search of information, buying, selling and accessing social networks, marketers focus intensively on developing websites that appeal to both searchers and search engines. Millions of webpages are submitted each day to search engines for indexing. The success of a search engine lies in its ability to provide accurate search results. Search engine algorithms constantly evaluate websites and webpages that could violate their respective policies, and some websites and webpages are subsequently blacklisted from their index. Websites are increasingly being utilised as marketing tools, which results in major competition amongst websites. Website developers strive to develop websites of high quality that are unique and content rich, as this assists them in obtaining a high ranking from search engines. In pursuit of this standard, website developers utilise search engine optimisation (SEO) strategies to earn a high search engine ranking. From time to time SEO practitioners abuse SEO techniques in order to trick the search engine algorithms, but the algorithms are programmed to identify and flag these techniques as spamdexing. Search engines do not clearly explain how they interpret keyword stuffing (one form of spamdexing) in a webpage. They regard spamdexing in many different ways and do not provide enough detail to clarify what crawlers take into consideration when determining the spamdexing status of a website. Furthermore, search engines differ in how they interpret spamdexing, and offer no clear quantitative evidence for the crossover point from keyword dense website text to spamdexing. Scholars have likewise expressed differing views on spamdexing, characterised by different keyword density measurements for the body text of a webpage. This raised several fundamental questions that form the basis of this research.
This research was carried out using triangulation in order to determine how scholars, search engines and SEO practitioners interpret spamdexing. Five websites with varying keyword densities were designed and submitted to Google, Yahoo! and Bing. Two phases of the experiment were conducted and the results recorded. During both phases almost all of the webpages, including the one with a 97.3% keyword density, were indexed, which enabled this research to conclusively disregard the keyword stuffing issue, blacklisting and any form of penalisation. Designers are urged instead to concentrate on usability and sound values when building a website. The research explored the fundamental contribution of keywords to webpage indexing and visibility: keywords affect website ranking and indexing whether or not their density is held to some optimum level of richness or poorness. However, the focus should be on how the end user would interpret the content displayed, rather than on how the search engine would react to it. Furthermore, spamdexing is likely to scare away potential clients and end users instead of attracting them, which is why the time spent on spamdexing should rather be used to produce quality content.
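For concreteness, keyword density is commonly computed as the number of occurrences of the keyword divided by the total number of words in the body text, expressed as a percentage. The sketch below uses that common definition (the thesis may measure it differently) with an illustrative single-word keyword.

```python
# Keyword density as commonly defined: occurrences of the keyword as a
# percentage of all words in the body text. Single-word keyword assumed.
import re

def keyword_density(body_text, keyword):
    words = re.findall(r"[a-z0-9']+", body_text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# A page whose body is almost nothing but the keyword approaches the
# extreme densities tested in the thesis (e.g. the 97.3% page).
page_text = "laptops " * 36 + "buy one here"
print(round(keyword_density(page_text, "laptops"), 1))   # 92.3
```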