Hypernymy and hyponymy are the semantic relations between a generic term (hypernym) and a specific instance of it (hyponym). The hypernym is also called a supertype, umbrella term, or blanket term. [1] [2] [3] [4] The hyponym is a subtype of the hypernym. The semantic field of the hyponym is included within that of the hypernym. [5] For example, pigeon, crow, and hen are all hyponyms of bird and animal; bird and animal are both hypernyms of pigeon, crow, and hen. [6]
In linguistics, semantics, general semantics, and ontologies, hyponymy (from Ancient Greek ὑπό (hupó) 'under' and ὄνυμα (ónuma) 'name') shows the relationship between a generic term (hypernym) and a specific instance of it (hyponym). A hyponym is a word or phrase whose semantic field is more specific than that of its hypernym. The semantic field of a hypernym, also known as a superordinate, is broader than that of a hyponym. One approach to the relationship between hyponyms and hypernyms is to view a hypernym as consisting of its hyponyms. This, however, becomes more difficult with abstract words such as imagine, understand and knowledge. While hyponymy is typically discussed for nouns, it also applies to other parts of speech. As with nouns, hypernyms among verbs are words that refer to a broad category of actions. For example, verbs such as stare, gaze, view and peer can be considered hyponyms of the verb look, which is their hypernym.
The meaning relation between hyponyms and hypernyms applies to lexical items of the same word class (that is, part of speech), and holds between senses rather than words. For instance, the word screwdriver in the co-hyponym example below refers to the screwdriver tool, and not to the screwdriver drink.
Hypernymy and hyponymy are converse relations. If X is a kind of Y, then X is a hyponym of Y and Y is a hypernym of X. [7] Hyponymy is a transitive relation: if X is a hyponym of Y, and Y is a hyponym of Z, then X is a hyponym of Z. [8] For example, violet is a hyponym of purple and purple is a hyponym of color; therefore violet is a hyponym of color. A word can be both a hypernym and a hyponym: for example, purple is a hyponym of color but is itself a hypernym of the many shades of purple between crimson and violet.
The hierarchical structure of semantic fields can be seen in hyponymy. [9] The hierarchy can be read from top to bottom, where the higher levels are more general and the lower levels more specific. [9] For example, living thing occupies the highest level, followed by plant and animal, while the lowest level may comprise dog, cat and wolf. [9]
Taxonomic hierarchical structures can also be formed from the relations of hyponymy and incompatibility. Two relations are involved: the first is exemplified by "An X is a Y" (simple hyponymy), while the second is "An X is a kind/type of Y". The second relation is more discriminating and can be classified more specifically under the concept of taxonomy. [10]
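As a concrete illustration of the hierarchy and the transitivity discussed above, the following minimal Python sketch stores each hyponym with its immediate hypernym and walks upward through the chain; the small taxonomy dictionary is invented for illustration.

```python
# Minimal sketch of a hyponym-to-hypernym hierarchy (entries are illustrative).
TAXONOMY = {
    "dog": "animal",
    "cat": "animal",
    "wolf": "animal",
    "animal": "living thing",
    "plant": "living thing",
    "violet": "purple",
    "purple": "color",
}

def is_hyponym_of(word: str, candidate_hypernym: str) -> bool:
    """Return True if candidate_hypernym lies anywhere above word,
    reflecting the transitivity of hyponymy."""
    current = word
    while current in TAXONOMY:
        current = TAXONOMY[current]
        if current == candidate_hypernym:
            return True
    return False

print(is_hyponym_of("dog", "living thing"))  # True: dog -> animal -> living thing
print(is_hyponym_of("violet", "color"))      # True: violet -> purple -> color
print(is_hyponym_of("violet", "animal"))     # False
```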
If the hypernym Z consists of hyponyms X and Y, then X and Y are identified as co-hyponyms (cohyponyms), also known as coordinate terms. Co-hyponyms are labelled as such when separate hyponyms share the same hypernym but are not hyponyms of one another, unless they happen to be synonymous. [7] For example, screwdriver, scissors, knife, and hammer are all hyponyms of tool and co-hyponyms of one another, but not hyponyms of one another: *"A hammer is a type of knife" is false.
Co-hyponyms are often but not always related to one another by the relation of incompatibility. For example, apple, peach and plum are co-hyponyms of fruit. However, an apple is not a peach, nor is it a plum; thus, they are incompatible. Nevertheless, co-hyponyms are not necessarily incompatible in all senses. Queen and mother are both hyponyms of woman, but there is nothing preventing a queen from being a mother. [11] This shows that co-hyponymy does not always entail incompatibility.
A word is an autohyponym if it is used for both a hypernym and its hyponym: [12] it has a stricter sense that is entirely a subset of a broader sense. For example, the word dog describes both the species Canis familiaris and male individuals of Canis familiaris, so it is possible to say "That dog isn't a dog, it's a bitch" ("That hypernym Z isn't a hyponym Z, it's a hyponym Y"). The term "autohyponym" was coined by the linguist Laurence R. Horn in a 1984 paper, Ambiguity, negation, and the London School of Parsimony. The linguist Ruth Kempson had already observed that if there are hyponyms for one part of a set but not another, the hypernym can complement the existing hyponym by being used for the remaining part. For example, finger describes any digit on a hand, but the existence of the word thumb for the first digit means that fingers can also be used for "non-thumb digits on a hand". [13] Autohyponymy is also called "vertical polysemy". [14]
Horn called this "licensed polysemy", but found that autohyponyms also form even when there is no other hyponym. Yankee is autohyponymous because it is both a hyponym (native of New England) and its hypernym (native of the United States), even though there is no other hyponym of Yankee (as native of the United States) that means "not a native of New England". [13] Similarly, the verb to drink (a beverage) is a hypernym for to drink (an alcoholic beverage). [13]
In some cases, autohyponyms duplicate existing, distinct hyponyms. The hypernym "smell" (to emit any smell) has a hyponym "stink" (to emit a bad smell), but is autohyponymous because "smell" can also mean "to emit a bad smell", even though there is no "to emit a smell that isn't bad" hyponym. [13]
Hyperonym and hypernym mean the same thing, and both are in use by linguists. The form hypernym interprets the -o- of hyponym as part of hypo-, as in hypertension and hypotension. However, etymologically the -o- is part of the Greek stem ónoma. In other combinations with this stem, e.g. synonym, it is never elided. Therefore, hyperonym is etymologically more faithful than hypernym. [15] Hyperonymy is used, for instance, by John Lyons, who does not mention hypernymy and prefers superordination. [16] The nominalization hyperonymy is rarely used, because the neutral term for referring to the relationship is hyponymy.
In computer science, this relationship is often termed an "is-a" relationship. For example, the phrase "Red is-a color" can be used to describe the hyponymic relationship between red and color.
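In object-oriented code the is-a relationship is conventionally modelled with inheritance. The brief Python sketch below (class names invented for illustration) expresses "Red is-a Color" as a subclass relationship.

```python
# "Red is-a Color": the subclass (hyponym) inherits from the superclass (hypernym).
class Color:
    def describe(self) -> str:
        return "a color"

class Red(Color):
    def describe(self) -> str:
        return "the color red"

red = Red()
print(isinstance(red, Color))  # True: every Red is-a Color
print(red.describe())          # "the color red"
```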
Hyponymy is the most frequently encoded relation among synsets used in lexical databases such as WordNet. These semantic relations can also be used to compare semantic similarity by judging the distance between two synsets and to analyse anaphora.
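As a rough illustration, the NLTK interface to WordNet exposes both the hypernym chains of a synset and a path-based similarity score derived from distance in the hypernym/hyponym graph. The sketch assumes the nltk package and its WordNet data are installed; exact numbers and synset names depend on the WordNet version.

```python
from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

dog = wn.synset('dog.n.01')
cat = wn.synset('cat.n.01')

# Path similarity judges closeness by distance in the hypernym/hyponym graph.
print(dog.path_similarity(cat))

# One full chain of synsets linking 'dog' to the root of the noun hierarchy.
print([s.name() for s in dog.hypernym_paths()[0]])
```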
Because a hypernym is a more general word than its hyponyms, the relation is used in semantic compression by generalization, replacing more specific terms with a shared, more general one.
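A minimal sketch of compression by generalization, again using NLTK's WordNet interface: two specific synsets are replaced by the lemma of their nearest shared hypernym. The chosen synsets and the exact output depend on the WordNet version.

```python
from nltk.corpus import wordnet as wn  # requires the WordNet data via nltk.download('wordnet')

def generalize(syn_a, syn_b):
    """Return the lemma name of the nearest hypernym shared by two synsets."""
    common = syn_a.lowest_common_hypernyms(syn_b)
    return common[0].lemma_names()[0] if common else None

car = wn.synset('car.n.01')
truck = wn.synset('truck.n.01')
# Both specific terms can be compressed to their shared, more general hypernym.
print(generalize(car, truck))  # a more general term such as 'motor_vehicle'
```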
The notion of hyponymy is particularly relevant to language translation, as hyponyms are very common across languages. For example, in Japanese the word for older brother is ani (兄), and the word for younger brother is otōto (弟). An English-to-Japanese translator presented with a phrase containing the English word brother would have to choose which Japanese word equivalent to use. This would be difficult, because abstract information (such as the speakers' relative ages) is often not available during machine translation.
A semantic network, or frame network, is a knowledge base that represents semantic relations between concepts in a network. This is often used as a form of knowledge representation. It is a directed or undirected graph consisting of vertices, which represent concepts, and edges, which represent semantic relations between concepts, mapping or connecting semantic fields. A semantic network may be instantiated as, for example, a graph database or a concept map. Typical standardized semantic networks are expressed as semantic triples.
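A semantic network can be sketched very simply as a list of subject-relation-object triples; in the toy Python example below the relation names are illustrative rather than drawn from any standard vocabulary.

```python
# A tiny semantic network stored as (subject, relation, object) triples.
triples = [
    ("pigeon", "is_a", "bird"),
    ("bird", "is_a", "animal"),
    ("bird", "has_part", "wing"),
    ("pigeon", "lives_in", "city"),
]

def neighbors(node: str):
    """Edges leaving a concept node in the directed graph."""
    return [(rel, obj) for subj, rel, obj in triples if subj == node]

print(neighbors("bird"))  # [('is_a', 'animal'), ('has_part', 'wing')]
```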
WordNet is a lexical database of semantic relations between words that links words into semantic relations including synonyms, hyponyms, and meronyms. The synonyms are grouped into synsets with short definitions and usage examples. It can thus be seen as a combination and extension of a dictionary and thesaurus. While it is accessible to human users via a web browser, its primary use is in automatic text analysis and artificial intelligence applications. It was first created in the English language; the English WordNet database and software tools have been released under a BSD-style license and are freely available for download from the WordNet website. There are now WordNets in more than 200 languages.
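A short sketch of how these relations are typically queried through NLTK's WordNet interface (assumes the WordNet data has been downloaded; output varies with the WordNet version):

```python
from nltk.corpus import wordnet as wn  # English WordNet via NLTK

bird = wn.synset('bird.n.01')
print(bird.definition())                        # the synset's gloss
print([l.name() for l in bird.lemmas()])        # synonyms grouped in the synset
print([s.name() for s in bird.hypernyms()])     # more general synsets
print([s.name() for s in bird.hyponyms()][:5])  # a few more specific synsets
```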
Semantic properties or meaning properties are those aspects of a linguistic unit, such as a morpheme, word, or sentence, that contribute to the meaning of that unit. Basic semantic properties include being meaningful or meaningless – for example, whether a given word is part of a language's lexicon with a generally understood meaning; polysemy, having multiple, typically related, meanings; ambiguity, having meanings which aren't necessarily related; and anomaly, where the elements of a unit are semantically incompatible with each other, although possibly grammatically sound. Beyond the expression itself, there are higher-level semantic relations that describe the relationship between units: these include synonymy, antonymy, and hyponymy.
A synonym is a word, morpheme, or phrase that means exactly or nearly the same as another word, morpheme, or phrase in a given language. For example, in the English language, the words begin, start, commence, and initiate are all synonyms of one another: they are synonymous. The standard test for synonymy is substitution: one form can be replaced by another in a sentence without changing its meaning. Words may often be synonymous in only one particular sense: for example, long and extended in the context long time or extended time are synonymous, but long cannot be used in the phrase extended family. Synonyms with exactly the same meaning share a seme or denotational sememe, whereas those with inexactly similar meanings share a broader denotational or connotational sememe and thus overlap within a semantic field. The former are sometimes called cognitive synonyms and the latter, near-synonyms, plesionyms or poecilonyms.
In lexical semantics, opposites are words lying in an inherently incompatible binary relationship. For example, something that is male entails that it is not female. It is referred to as a 'binary' relationship because there are two members in a set of opposites. The relationship between opposites is known as opposition. A member of a pair of opposites can generally be determined by the question What is the opposite of X?
Linguistics is the scientific study of human language. Someone who engages in this study is called a linguist.
Polysemy is the capacity for a sign to have multiple related meanings. For example, a word can have several word senses. Polysemy is distinct from monosemy, where a word has a single meaning.
Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.
In database design, object-oriented programming and design, has-a is a composition relationship where one object "belongs to" another object, and behaves according to the rules of ownership. In simple words, has-a relationship in an object is called a member field of an object. Multiple has-a relationships will combine to form a possessive hierarchy.
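For contrast with the is-a example above, the following brief Python sketch (class names invented for illustration) shows a has-a relationship: the owned object is simply a member field of the owner.

```python
# "Car has-a Engine": composition, not inheritance; the Engine is a member field.
class Engine:
    def start(self) -> str:
        return "engine started"

class Car:
    def __init__(self) -> None:
        self.engine = Engine()  # the has-a relationship: Car owns an Engine

    def start(self) -> str:
        return self.engine.start()

print(Car().start())  # "engine started"
```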
Semantic change is a form of language change regarding the evolution of word usage—usually to the point that the modern meaning is radically different from the original usage. In diachronic linguistics, semantic change is a change in one of the meanings of a word. Every word has a variety of senses and connotations, which can be added, removed, or altered over time, often to the extent that cognates across space and time have very different meanings. The study of semantic change can be seen as part of etymology, onomasiology, semasiology, and semantics.
In linguistics, a word sense is one of the meanings of a word. For example, a dictionary may have over 50 different senses of the word "play", each of these having a different meaning based on the context of the word's usage in a sentence, as follows:
We went to see the play Romeo and Juliet at the theater.
The coach devised a great play that put the visiting team on the defensive.
The children went out to play in the park.
In linguistics, semantic analysis is the process of relating syntactic structures, from the levels of words, phrases, clauses, sentences and paragraphs to the level of the writing as a whole, to their language-independent meanings. It also involves removing features specific to particular linguistic and cultural contexts, to the extent that such a project is possible. The elements of idiom and figurative speech, being cultural, are often also converted into relatively invariant meanings in semantic analysis. Semantics, although related to pragmatics, is distinct in that the former deals with word or sentence choice in any given context, while pragmatics considers the unique or particular meaning derived from context or tone. To reiterate in different terms, semantics is about universally coded meaning, and pragmatics, the meaning encoded in words that is then interpreted by an audience.
A semantic lexicon is a digital dictionary of words labeled with semantic classes so associations can be drawn between words that have not previously been encountered. Semantic lexicons are built upon semantic networks, which represent the semantic relations between words. The difference between a semantic lexicon and a semantic network is that a semantic lexicon has definitions for each word, or a "gloss".
In linguistics, troponymy is the presence of a 'manner' relation between two lexemes.
In linguistics, a semantic field is a lexical set of words grouped semantically that refers to a specific subject. The term is also used in anthropology, computational semiotics, and technical exegesis.
Contemporary ontologies share many structural similarities, regardless of the ontology language in which they are expressed. Most ontologies describe individuals (instances), classes (concepts), attributes, and relations.
In natural language processing, semantic compression is a process of compacting a lexicon used to build a textual document by reducing language heterogeneity, while maintaining text semantics. As a result, the same ideas can be represented using a smaller set of words.
Taxonomy is the practice and science of categorization or classification.
The following outline is provided as an overview of and topical guide to natural-language processing.
Automatic taxonomy construction (ATC) is the use of software programs to generate taxonomical classifications from a body of texts called a corpus. ATC is a branch of natural language processing, which in turn is a branch of artificial intelligence.
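One classic ATC technique is matching lexico-syntactic (Hearst) patterns such as "Y such as X" to harvest hyponym-hypernym pairs from text. The regex-based Python sketch below is deliberately minimal and covers only that single pattern.

```python
import re

# Matches the Hearst pattern "<hypernym> such as <hyponym>[, <hyponym> ...]".
PATTERN = re.compile(r"(\w+)\s+such as\s+((?:\w+)(?:,\s*\w+)*)")

def extract_pairs(text: str):
    """Return (hyponym, hypernym) pairs found via the single pattern above."""
    pairs = []
    for hypernym, hyponym_list in PATTERN.findall(text):
        for hyponym in re.split(r",\s*", hyponym_list):
            pairs.append((hyponym, hypernym))
    return pairs

corpus = "They sell tools such as hammers, screwdrivers and saws."
print(extract_pairs(corpus))  # [('hammers', 'tools'), ('screwdrivers', 'tools')]
```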