A traditional criticism of term logic concerns its limited expressive power. NAL solves this problem by introducing multiple copulas and compound terms, layer by layer. The ideas covered in these three layers mainly come from set theory.
In IL-2, similarity means perfect (mutual) substitutability.
The instance, property, and instance-property copulas mark the end of transitivity in one or both directions of an inheritance chain, and represent "individuals" and "features".
The instance and property copulas correspond to two ways to specify a set. A term can be a set, but not necessarily so.
Valid syllogistic rules of NAL-2 include resemblance, analogy, and comparison, which are variants of the syllogistic rules of NAL-1.
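The three NAL-2 rules can be sketched as truth-value functions on (frequency, confidence) pairs. This is a minimal illustration, assuming the truth functions as they are commonly stated for NAL (the function bodies are my rendering, not a quote from any implementation; `k` is the usual evidential-horizon personality parameter):

```python
# Truth values are (frequency, confidence) pairs in [0, 1].

def AND(*xs):
    p = 1.0
    for x in xs:
        p *= x
    return p

def OR(*xs):
    p = 1.0
    for x in xs:
        p *= 1.0 - x
    return 1.0 - p

def resemblance(f1, c1, f2, c2):
    # S <-> M, M <-> P  |-  S <-> P
    return AND(f1, f2), AND(c1, c2, OR(f1, f2))

def analogy(f1, c1, f2, c2):
    # S --> M, M <-> P  |-  S --> P  (second premise is the similarity)
    return AND(f1, f2), AND(c1, c2, f2)

def comparison(f1, c1, f2, c2, k=1.0):
    # S --> M, P --> M  |-  S <-> P  (evidence-based: w+ and w)
    w_plus = AND(f1, f2, c1, c2)
    w = AND(OR(f1, f2), c1, c2)
    f = w_plus / w if w > 0 else 0.0
    return f, w / (w + k)
```

Note how comparison, like all weak (induction-style) rules, keeps its confidence below 1 even from perfectly confident premises, while resemblance and analogy inherit the strength of their premises.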
The inference rules of IL-3 come from the definitions of the related compound terms, and may take one or two premises. The conclusion may contain new terms not included in the system's vocabulary.
In a term logic, the compositional and structural rules can be seen as variants of the syllogistic rules.
Intersection and union are dual operators, as in set theory, with respect to the extension and intension of a term.
The inference rules of NAL-3 include compositional, decompositional, and structural rules, as well as a choice rule that takes simplicity into account. The rules defined in lower layers remain valid when a compound is used as a whole.
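The compositional rules of NAL-3 can be illustrated as follows. From the two premises "M --> T1 <f1, c1>" and "M --> T2 <f2, c2>", the system can compose an intersection and a union of the predicate terms; the truth functions below are assumptions following the common presentation of these rules, not a quote from any implementation:

```python
def OR(x, y):
    return 1.0 - (1.0 - x) * (1.0 - y)

def compose_intersection(f1, c1, f2, c2):
    # M --> (T1 & T2): holds of M only when both components hold
    return f1 * f2, c1 * c2

def compose_union(f1, c1, f2, c2):
    # M --> (T1 | T2): holds of M when either component holds
    return OR(f1, f2), c1 * c2
```

The duality with set theory shows up here: when the shared term M appears in the predicate position instead, the roles of intersection and union in the two truth functions are exchanged.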
In set theory, "relation" is defined similarly; the difference is that in IL and NAL a relation is not limited to sets defined extensionally.
The inference rules of IL-4 come from the definitions of the related compound terms. Each of them takes only one premise.
Similarly, each inference rule of NAL-4 takes only one premise, and produces conclusions with the same truth-value, since the premise and the conclusions express the same content, though in different forms.
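The single-premise NAL-4 rules rewrite a product statement into its image forms. The sketch below uses plain Python tuples as a stand-in representation (the tuple encoding is mine, not Narsese syntax): "(A x B) --> R" says the pair (A, B) is in relation R, and it can be rewritten as "A --> (/ R _ B)" or "B --> (/ R A _)" without changing the truth value, since all three forms express the same content.

```python
# Structural transformation between a product statement and its
# extensional-image forms. '_' marks the position of the new subject.

def product_to_images(subject_pair, relation):
    a, b = subject_pair
    return [
        (a, ('ext_image', relation, '_', b)),   # A --> (/ R _ B)
        (b, ('ext_image', relation, a, '_')),   # B --> (/ R A _)
    ]

stmt = (('water', 'salt'), 'dissolve')  # (water x salt) --> dissolve
for subj, pred in product_to_images(*stmt):
    print(subj, '-->', pred)
```

Because these are equivalences, the transformation is reversible, and the derived statements carry exactly the truth-value of the premise.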
A semantic relation can be extensional, intensional, or both. The extension and intension of a concept mutually determine each other in IL, and their sizes change in opposite directions. In NARS, they are defined differently from the conventional definition (which presumes model-theoretic semantics), while still keeping the same intuitive meaning.
The cognitive processes usually called "recognition", "perception", and "categorization" can often be seen as answering a question of the form "T → ?" for a given term T, which can be a compound. There are often multiple answers that are not mutually exclusive, but form an inheritance hierarchy. The choice rule in NAL answers this type of question by balancing the expectation and simplicity of the candidates. Some control factors, such as familiarity and relevance, also play important roles, since the system cannot consider all candidates.
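The balance between expectation and simplicity can be sketched as a scoring function over candidate answers. The expectation formula e = c * (f - 0.5) + 0.5 is standard in NAL; dividing e by syntactic complexity is one simple way to let simplicity count, used here only as an illustration (the candidate terms and numbers are made up):

```python
def expectation(f, c):
    # e = c * (f - 0.5) + 0.5: pulls frequency toward 0.5 as confidence drops
    return c * (f - 0.5) + 0.5

def choose(candidates):
    # candidates: list of (term, frequency, confidence, complexity)
    return max(candidates, key=lambda t: expectation(t[1], t[2]) / t[3])

answers = [
    ('bird',   0.95, 0.90, 1),               # simple, well supported
    ('animal', 0.98, 0.95, 1),               # simple, even better supported
    ('small-flying-animal', 0.99, 0.90, 3),  # specific but complex
]
print(choose(answers)[0])  # prints "animal"
```

With these numbers, the compound candidate loses despite its high frequency, because its complexity penalty outweighs its extra specificity.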
When deciding the degree of membership of T to a concept C in an inheritance hierarchy, there are two opposite tendencies: specificity (representativeness) and probability. The "Conjunction Fallacy" comes from the assumption that a concept is defined merely by its extension (instances). Basic level categories provide a compromise between the two tendencies.
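A small numeric illustration of why the "Conjunction Fallacy" is a fallacy only under a purely extensional reading: the measure of an intersection can never exceed that of either set, even though a conjunction can be more representative of a description. The population numbers below are made up for the illustration:

```python
# Purely extensional (probabilistic) reading of two concepts:
population = 1000
bank_tellers = 50
feminist_bank_tellers = 20   # a subset of bank_tellers

p_teller = bank_tellers / population
p_conjunction = feminist_bank_tellers / population

# For extensions this inequality always holds, no matter the numbers:
assert p_conjunction <= p_teller
```

When intension (how well the description fits) is also part of a concept's meaning, as in NAL, ranking the conjunction higher is no longer a plain contradiction but a conflict between the two tendencies.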
The meaning of a compound term is semi-compositional: initially it is fully determined by the syntactic relations among its components, but later more and more by the acquired relations of the compound as a whole, which usually cannot be derived from the former.
Restricted by available resources, when processing a given task, each involved concept is normally used with partial meaning. Which part will be used is influenced by the priority distribution among beliefs, which depends on experience and context. For some concepts, there is a stable "core meaning", which corresponds to their "essence" and "definition". When a concept is used with a meaning that differs from the norm, it corresponds to a "metaphorical" usage of the concept.
A useful concept usually has relatively sharp and balanced extension and intension; basic level categories and natural kinds are typical examples.
The NARS categorization model takes the existing categorization models as special cases.
In NARS, all forms of empirical knowledge are producible and modifiable by experience (though they can be implanted, too). At this level, learning is complete.
On the other hand, the grammar rules, inference rules, and control mechanisms are defined at the meta-level, which are not acquired, but built-in.
In NARS, learning and reasoning are basically two aspects of the same process. Learning is an open-ended process that does not follow any predetermined algorithm. This feature distinguishes NARS from conventional "machine learning" work, where "learning" is usually studied as the execution of a fixed algorithm.
New concepts appear in the system in three ways: they are accepted, composed, or altered. The "original meaning" of a concept is not necessarily its "current" meaning. In general, there is no "correct", "true", or "ultimate" meaning for a concept, though concepts with stable and clear meaning are preferred.
The learning process selects useful concepts based on repeatedly experienced patterns, so as to summarize experience and to process tasks efficiently. The goal of learning is not "to know the world as it is", but "to adapt to the environment as the system needs".