SEO Glossary of Terms
A Glossary of Terms relating to search engine optimisation (seo)
There are 618 entries in the glossary.
Term Definition

Shadow domain

Shadow domain, or ghost domain, is the name for a domain that channels traffic to another site. A throwaway domain is registered which then either links or redirects its traffic to the intended "real" site. Doing this is considered to be creating spam, and search engines will not rank a site found to be using a shadow domain.
 

Sidewinder

This is the name of the spider that Infoseek uses.
 

Similarity

This is a term for the measure of similarity between a query and the document found. It can also be used when comparing two documents.
 

Singular value decomposition

This is the term for decomposing a huge database in order to discover the relevance of documents by comparing them to other documents. This is done through stemming, local and global weighting, and normalisation.
 

Siphoning

Siphoning is a technique that steals another website's traffic, usually by using spyware or cybersquatting.
 

Site Hit

See hit for more details.
 

Site search

A site search is a search that will only look through documents contained on one particular website, unlike a search engine which will look at sites from the whole web.
 

Sitemap

A sitemap is just that, a map of a website. It contains links to every page on the site and spiders are able to quickly gain access to all of the pages on a site. Having a link to a sitemap on every page of a website is very useful.
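For search engines specifically, a sitemap can also be supplied as an XML file in the sitemaps.org format; a minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```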
 

Skewing

This is the term for the act of a search engine artificially changing the results of a search in order to allow documents to score highly on particular searches.
 

Slamming

This is an old and little used term for spamdexing.
 

Slashdot

Slashdot was created by Rob Malda and is a community news site that is centrally edited and concentrates on technology.
 

Slurp

This is the name of Inktomi's spider.
 

Sniffer

This is the name of the program that Infoseek used in order to "sniff out" any spamdexing.
 

Snippet

This is the term for the small amounts of quoted content that some search engines will present in their search engine results pages, in place of the webmaster-created descriptions. The name comes from the robots meta tag value "NOSNIPPET", which turns snippets off.
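The meta tag mentioned above is placed in a page's head section; in its standard form:

```html
<meta name="robots" content="nosnippet">
```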
 

Social media

Social media is the term given to websites that encourage the users to submit content that is valuable. Social news sites are one example.
 

Sorting results

Sorting results is the term given to the rankings on a search engine's results pages, the most relevant results being sorted so that they are at the top and the least relevant at the bottom. However, depending on what search engine they are using, the user can choose to have the results sorted in a different form, such as alphabetically.
 

Spam

Spam is a collective term used to describe every unethical method of reaching a wide audience. This usually takes the form of a mass emailing campaign and can stretch to millions of emails; these methods are used because they are cheap.
 

Spamdexing

Spamdexing is a set of methods and techniques that can be used in order to fool search engines and get an unfair advantage in rankings. This can undermine a search engine's index as it can reduce the relevancy of searches. However, search engines do try to stop all attempts at spamdexing, using software to detect it. Pages found to be spamdexing can be de-listed or penalised in some other way.
 

Spamming

Spamming is the creation of spam and its distribution.
 

Spider / spyder

Spiders obey the robots.txt protocol and are used to crawl, or spider, the web looking for pages to include in the search engine's index.
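The robots.txt protocol mentioned above is a plain text file placed at the root of a site; a minimal example with an illustrative directory name:

```
User-agent: *
Disallow: /private/
```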
 

Spidering

This is the term for what spiders do: they will crawl, or spider, the web and index documents for a search engine.
 

Splash page

This is a web page that, although intricately designed and rich in features, has very little to offer in terms of content and usability. As a result search engines don't have much to index, so it is important to have a home page with relevant content on it.
 

Splog

A splog is a spam blog. These are usually made up of very poor quality and stolen content.
 

Spoofing

Similar to spamdexing and IP spoofing.
 

Spyware

Spyware is a type of malicious software that can be installed on a computer without the owner knowing, usually through opening an email or clicking a pop-up. This software then spies on what the user does and can relay this information back to a third party using the spyware. The kind of information spied on is normally things such as bank details and other personal information.
 

Squidoo

Squidoo was created by Seth Godin and is a topical lens site.
 

SSI

SSI, or Server Side Includes, is the term for the method used to take sections of a page from your website and include them in another. This makes website updates easy; however, you must have PHP or another language that makes it easy to include files, use file names ending in the .shtm or .shtml extension, or change your .htaccess so that .html and .htm files are processed as though they were .shtml files.
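As a sketch (the path is illustrative), an SSI directive embedded in an .shtml page looks like this:

```html
<!--#include virtual="/includes/header.shtml" -->
```

On Apache, the .htaccess change described above can be made with directives such as `Options +Includes` and `AddOutputFilter INCLUDES .html`, though the exact directives vary by server version.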
 

Static content

Static content is the term for content that does not change, or changes very infrequently. The advantage of having changing content is that your archive will grow and people, wanting to see fresh content, will return to your site. This in turn builds brand awareness, equity and mindshare, all of which are invaluable in online marketing.
 

Stats / statistics

When thinking of search engines, statistics are the information, contained in the log files, that reporting software will present. This information takes the form of aspects like the number of searches performed, visitors, referrers etc.
 

Stealth

Stealth is a term for the collective use of techniques used to fool search engines such as cloaking, which shows optimised content to a spider but a totally different page to a visitor.
 

Stemming

Stemming is a method by which a word is linguistically reduced to its root in order to compare documents in a search query. For instance, if the word "builder" was entered into a search engine, it would reduce that word to "build" and then show all the documents containing that root word, such as "building" and "builders".
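A toy illustration of suffix stripping in Python — real engines use more careful algorithms such as Porter's stemmer, and the suffix list here is purely illustrative:

```python
def crude_stem(word):
    """Strip a few common suffixes to approximate a word's root.

    A toy sketch only; production stemmers (e.g. Porter's algorithm)
    handle far more cases and exceptions.
    """
    for suffix in ("ers", "ing", "ed", "er", "s"):
        # Only strip when a reasonable-length root remains
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

# "builder", "building" and "builders" all reduce to the root "build"
for w in ("builder", "building", "builders"):
    print(w, "->", crude_stem(w))
```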
 

Stop character

A stop character is a character within a URL that will tell the spider to cease crawling after a certain point. These characters are usually question marks or ampersands.
 

Stop words

These are words such as conjunctions which tend to be used so much that they are ignored by search engines and have no relevancy.
 

Sub-category

A sub-category is a low level category within a directory. Directories usually have several levels of categories.
 

Subject-specific popularity

This is a Teoma feature that will not only rank pages by relevancy but also by how many other same topic pages have referenced them. This is a community approach to ranking and by allowing users to vote on page relevancy the results returned by a query are much more relevant.
 

Submission

This is the method of submitting a URL manually to a search engine in order for it to be spidered and indexed. This will get a website added to a search engines database.
 

Submission rules

Submission rules are the rules that a search engine will provide in order for a person to submit websites to be spidered. These rules are usually about the number of pages each day that can be submitted or the amount of times a page can be resubmitted.
 

Submission service

A submission service is a fee-paying service that will submit your URLs to search engines. Be warned, though: these submissions are normally low quality, so if you are going to use a submission service, use one with a good reputation.
 

Submission software

Submission programs are a type of software that will help in submitting URLs to search engines. There are many variations on this software but only a few are worth using.
 

Submit

When you submit, you are manually adding a URL for a search engine to spider and index.
 

Substring matching

Substring matching is the method that some search engines will use whereby not only exact matches are returned in the SERPs but also some partial word matches.
 

Sullivan, Danny

Danny Sullivan created and edits SearchEngineWatch.com, and also has a blog called Daggle.
 

Supplemental results

Supplemental results are those pages that have lower weighting and link popularity and are less trusted than the ones contained in the main search index. These pages have a lower crawl frequency and are not viewed as much as those that rank higher.
 

T-Rex

T-Rex is the name for the spider that Lycos uses.
 

Taxonomy

Taxonomy is a system of classification that normally hierarchically organises topical subjects.
 

Technorati

Technorati is a search engine for blogs that tends to be heavily weighted towards those blogs that contain popular stories and items and link relationships.
 

Telnet

Telnet is a remote computer service that is used for manipulation and script initialisation.
 

Teoma

Ask.com is powered by Teoma and it is a search engine that is topical and community based. It works by using the theory of authorities and hubs by Kleinberg.
 

Term frequency (TF)

This is the total number of times that a keyword is used within a document or group of documents.
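As a sketch, raw term frequency can be counted in Python like this (the example document is illustrative):

```python
import re
from collections import Counter

def term_frequency(document, keyword):
    """Raw count of a keyword's occurrences in a document (case-insensitive)."""
    words = re.findall(r"[a-z]+", document.lower())
    return Counter(words)[keyword.lower()]

doc = "The site map links every page of the site."
print(term_frequency(doc, "site"))  # 2
```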
 

Term vector database

This is a database that contains weighted documents and attempts to understand the various document topics when comparing how similar they are to other documents. These documents are then matched to the query based on the angle and length of the vectors.
 

Term vectors

When using Salton's model of vector-space retrieval, all queries and documents are converted into term vectors. This enables the documents to be matched to queries. The documents are then ranked by the total number of times the search terms appear within each document.
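A minimal sketch of the idea in Python, scoring a document against a query by the angle between their term vectors (cosine similarity); the texts are illustrative:

```python
import math
from collections import Counter

def term_vector(text):
    """Bag-of-words term vector: each word mapped to its count."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine of the angle between two term vectors (1.0 = same direction)."""
    dot = sum(count * b[term] for term, count in a.items())
    norm_a = math.sqrt(sum(c * c for c in a.values()))
    norm_b = math.sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

query = term_vector("cheap flights")
doc = term_vector("cheap flights and cheap hotels")
print(round(cosine_similarity(query, doc), 2))  # 0.8
```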
 

Termination page

A termination page is a page that although it can be accessed from links in the core does not link back.
 

Text link ads

These are ads which are text links instead of other ad formats such as banners or pop-ups. They are used because people are more tempted to click on a text link than on an advertisement. As search engines want to count only links that are editorial, they will tend to ignore links that are grouped with paid links, especially if they are to commercial sites that are not relevant.
 

The tragedy of the commons

This is a type of story that shows that in order to get something which is good for the masses, or the commons, some people have to give something up. In marketing terms the commons are the customers, and Google were fortunate to win as they developed methods to make advertising less irritating to the commons.
 

Theme engine

A theme engine is a type of search engine that will try to use the keywords a site contains in order to classify it.
 

Thesaurus

A thesaurus is like a dictionary, however rather than give the meaning of words it will give a list of synonyms for words entered.
 

Throwaway domain

A throwaway domain is a domain name that is of very little value to the owner. They are normally used to experiment with cloaking and other unethical SEO techniques. Some people who practise unethical SEO may even link build using throwaway domains in an attempt to improve a site's link popularity. However, search engines warn against this and are always finding new methods to combat this type of activity.
 

Title

A title is used as a description of the content of a document, and a web page's title is displayed in the browser window, normally at the top of the browser screen. A title is used by search engines in order to rate a document's relevancy in a search. It is very important, and is placed in the header section of the HTML.
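In HTML the title sits inside the head section; a sketch with an illustrative title:

```html
<head>
  <title>Blue Widgets - Widget Supplies | Example Ltd</title>
</head>
```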
 

TLD

TLD, or Top Level Domain, is the part of a domain name at the top of the domain hierarchy. Examples of these addresses usually end in .org, .gov, .net, .edu, .info and .biz.
 

Toolbar

Today many large companies will freely distribute their own toolbars which can search; these toolbars can also have other features such as pop-up blockers and auto-fill. This is a very good way of gaining market share.
 

Top-level page

Top-level page is another name for a website's default page. When a search engine tells you to only submit your top-level page, this is the page it is referring to. Once this page has been found by the spider, it will be able to access the rest of the site's pages. Due to this, it is a good idea to link your sitemap to this page so the rest of the site can easily be found by the spider.
 

Topic-sensitive PageRank

Topic-sensitive PageRank is an alternative way of computing PageRank which will only focus on one topic and will score that instead of scoring globally.
 

Trackback

Trackback is an automated type of notification that tells you when your website is mentioned by another site. The majority of good blogging platforms have this feature built in. However, it is very easy to spam, so most bloggers and publishers will turn the feature off as the signal-to-noise ratio is low.
 

Traffic

This is a term given for a website's activity; some people refer to visitors as traffic, but it can be the number of hits or even unique visitors.
 

Traffic-death redesign

This is the term given for when a website changes all its pages, and users wishing to visit the old pages will get a 404 error message. The URL of the site remains the same but none of the old pages can be accessed.
 

TrustRank

TrustRank is a search relevancy algorithm that will put extra weight on certain links that are from major corporations and educational or governmental institutions, as these are considered to be trusted seed websites.
 

Turbo10

Turbo10 is a meta search engine that not only searches documents that are indexed, or surface-web, but also the DeepNet, or invisible web - these are documents that are not indexed.
 

Type-less search engine

This is the name for a search engine that does not use keywords to perform searches. Instead mouse clicks or voice recognition are used instead.
 

Typepad

Typepad is a hosted blogging platform. It is provided by SixApart and allows users to publish content so that it seems as though it is on its own domain. Like any hosted platform, it can be hard to claim back a website's age-related trust and link equity if you would like to make money from your website. If this is the case, it is always best to host your own domain.
 

Unethical SEO

SEO is usually neither ethical nor unethical, but there are some techniques that are more frowned upon than others. This set can constantly change, however, so when looking for an SEO expert it is best to opt for someone who is trusted, rather than someone who merely says they are ethical.
 

Unique user

See unique visitor for details.
 

Unique visitor / s / uniques

This is the term given for one single person who visits a website. This person may, over a length of time, revisit the site many, many times, but the site's log file will only show this person as being one unique visitor.
 

Update

All search engines perform updates to their sets of data and algorithms in order for them to remain fresh and unique and to ensure that their relevancy algorithms are difficult to manipulate. These updates are performed almost constantly.
 

Upload

This is the term given for the act of transferring data from a local drive to a server. The internet is then utilised to gain access to this data.
 

URL

URL, or Uniform Resource Locator (sometimes Universal Resource Locator), is the Internet address for a website. Every single website and Internet resource must have a URL in order for it to be found.
 

URL conversion

This is a way of making a dynamic site look static; this is done by removing the variable information from the URL, so signs such as & and ? will appear as / in the URL. This enables the site to conform to some of the different systems. Generally doing this is not a problem for the server, although it can present some processing overhead.
 

URL rewrite

A URL rewrite is a technique that will help to improve a sites indexing by altering a URL to make it more descriptive and unique.
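On Apache this is commonly done with the mod_rewrite module in a .htaccess file; a sketch, assuming an illustrative product.php script:

```apache
RewriteEngine On
# Serve /products/blue-widget from product.php?name=blue-widget
RewriteRule ^products/([a-z0-9-]+)/?$ product.php?name=$1 [L]
```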
 

URL submission

This is the same as submission and is the act of manually adding a URL to a search engine. This will instruct the search engines spider to index the site to a search engine.
 

URL variable

This is a term for the information that is passed to the server along with the URL. The server then uses this information within the context of the server-based program or particular script that it is running.
 

Usability

This is a term that looks at how simple it is for a visitor to a site to do the action they want to. By looking at and improving a website's formatting and structure, a person can seriously improve the website's usability and in turn improve the site's rate of conversion.
 

Usage data

This is data such as the number of times visitors view a page; high traffic, high clickthroughs etc. are seen as an indication of quality for a website. This data is used to test a website's quality.
 

Usenet

Usenet is a worldwide distributed discussion system made up of topical newsgroups to which users post messages.
 

Variable /URL variable

This is a term for the information that is passed to the server along with the URL. The server then uses this information within the context of the server-based program or particular script that it is running.
 

Vertical portal

See vortal for further information.
 

Vertical search

This is a type of search that concentrates on one field or one type of information or format. YouTube is an example of a vertical search engine that is based on videos.
 

Viral marketing

Viral marketing is a type of marketing that spreads itself quickly, like a virus. Emailing campaigns that reach thousands of people at the same time are an example of viral marketing, blogging can also be classed as viral marketing.
 

Virtual domain

A virtual domain is the name given for a domain that is not hosted on its own server, instead it shares one with other domains. The domain itself will be unique, however the IP address will be a shared one.
 

Virtual server

A virtual server is a server that is shared with many other domains. It can be a very cheap way of hosting a domain, although access speeds will be slower, so if you are looking for fast access it is better to have a dedicated server to host your domain.
 

Visitor

A visitor is the term that is given to the total number of times people visit a website, not the number of people individually visiting a site once.
 

Vortal

A vortal is a portal that only focuses on one certain (vertical) subject. They are usually aimed at one group of people such as SEO experts or computer programmers.
 

Wales, Jimmy

Wikipedia was co-founded by Jimmy Wales.
 

Wayback machine

The Wayback Machine is a huge archive that has taken snapshots of websites over the years, thus allowing the user to see 'way back' into a website's past.
 

Web copywriting

Web copywriting is very much the same as regular copywriting, but it has evolved to reach an online audience. Today web copywriting is tailored to appeal to spiders and includes many keywords and other SEO techniques. However, there are some people who disagree with this and think that web copywriting should only look at converting a site's visitors into customers, rather than getting the visitors there in the first place.
 

Web record

A web record is the information that a search engine will hold about web pages. When a spider crawls a site it will not index everything on the website; only the content of a site will be indexed, as this is what a query will be looking at. Web records differ between search engines, as all of them rank relevance differently.
 

Webcrawler

This is a meta search engine that is quite old.
 

Weighting

Weighting is a term search engines use when deciding the relevance of a document when performing a search. The more a document weighs - i.e. the more keywords the document contains - the more relevant it is.
 

White hat SEO

All search engines have a set of non-static guidelines within which webmasters may operate while the engines make money through advertising revenue. Within this, there are types of marketing that are seen as unethical or deceptive, and these are termed Black Hat SEO. If you stay within the guidelines, you are considered to use White Hat SEO methods. However, it is probably better to trust someone who has extensive knowledge of algorithms, or whom you trust, rather than someone who simply says they practise white hat SEO.
 

WHOIS

WHOIS is a kind of search, the difference being that it uses domain names for the query and the results page will show the domain's details. These include things such as the date of registration and the owner of the domain.
 

Wiki

Wiki is publishing software that enables information to be put online using collaborative editing.
 

Wikipedia

This is a free online encyclopedia that allows people to collaborate to fill it with information.
 

Wisenut

Wisenut was a search engine that for some time looked as though it would be as credible as Google, however it did not achieve this and closed down on 28 September 2007.
 

Word stuffing

Word stuffing is another term for keyword stuffing.
 

Wordnet

Wordnet is a lexical database of the English language that assists search engines in understanding the relationships between words.
 

Wordpress

Wordpress is a software platform that is mainly used for blogging. It is open source, so anyone can publish and be hosted on it. If you want your site to make you money or to build a brand, it is better to host on your own domain, as it can be hard to claim a site's age-related trust and vital link equity if it is built on a domain that is not your own.
 

Wordtracker

This is one of the paid keyword research tools. It tends to only use a couple of search engines, such as Dogpile.
 

Xenu

Xenu is one of the most popular Internet-based programs used for checking the validity of links.
 

Xenu link sleuth

This is a free software program that allows a website owner to check for broken links, both internally and externally. It also aids the creation of sitemaps.
 

XHTML

XHTML, or Extensible HyperText Markup Language, is a set of specifications that were created in order to move HTML to the formatting of XML.
 

XML

XML (Extensible Markup Language) is a markup language that allows the author to customise their tags. It is often used in web-based applications as it is so user friendly for the programmer.
 

Y!

Y! is the shortened term for Yahoo!
 

Yahoo!

Yahoo! is a web-based directory that is one of the most popular on the Internet.
 

Yahoo! Answers

This is a free Q&A site owned by Yahoo! that allows people to both ask and answer questions, and it results in free content.
 

Yahoo! Directory

This is one of the oldest and most trusted directories. For a fee of $299 per year a site can be in the Yahoo! Directory and gain trusted links. This is almost invaluable to a site that is legitimate and wants to gain more traffic and trust.
 

Yahoo! Search marketing

This is a paid search platform and is owned by Yahoo! It was once called Overture.
 

Yahoo! Site explorer

This is a tool that is used by webmasters to research which website pages Yahoo! has indexed and creates a list of the links to and from those pages.
 

YouTube

YouTube is owned by Google and it is a site on which members of the public can upload their own videos for other people to watch.
 

Zeal

Zeal was a directory that was non-commercial. It was bought out by Looksmart, then it suddenly ceased operating.
 

Zone

Zone is another term for the narrowing of a topic within a search engine to allow the person searching to only be given results within defined parameters, such as date range or the area of the world the site originated from.
 

