In the mathematics of binary relations, the composition of relations is the forming of a new binary relation R; S from two given binary relations R and S. In the calculus of relations, the composition of relations is called relative multiplication,[1] and its result is called a relative product.[2]: 40  Function composition is the special case of composition of relations where all relations involved are functions.

The word uncle indicates a compound relation: for a person to be an uncle, he must be the brother of a parent. In algebraic logic it is said that the relation of Uncle (${\displaystyle xUz}$) is the composition of relations "is a brother of" (${\displaystyle xBy}$) and "is a parent of" (${\displaystyle yPz}$). ${\displaystyle U=BP\quad {\text{ is equivalent to: }}\quad xByPz{\text{ if and only if }}xUz.}$

Beginning with Augustus De Morgan,[3] the traditional form of reasoning by syllogism has been subsumed by relational logical expressions and their composition.[4]

## Definition

If ${\displaystyle R\subseteq X\times Y}$ and ${\displaystyle S\subseteq Y\times Z}$ are two binary relations, then their composition ${\displaystyle R;S}$ is the relation ${\displaystyle R;S=\{(x,z)\in X\times Z:{\text{ there exists }}y\in Y{\text{ such that }}(x,y)\in R{\text{ and }}(y,z)\in S\}.}$

In other words, ${\displaystyle R;S\subseteq X\times Z}$ is defined by the rule that says ${\displaystyle (x,z)\in R;S}$ if and only if there is an element ${\displaystyle y\in Y}$ such that ${\displaystyle x\,R\,y\,S\,z}$ (that is, ${\displaystyle (x,y)\in R}$ and ${\displaystyle (y,z)\in S}$).[5]: 13
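The definition can be illustrated with a short Python sketch, modeling a relation as a set of ordered pairs (the example data and names below are hypothetical, not from the text):

```python
# A binary relation R ⊆ X × Y is modeled as a Python set of ordered pairs.
def compose(R, S):
    """R ; S = {(x, z) : there exists y with (x, y) in R and (y, z) in S}."""
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

# The "uncle" example from the introduction:
B = {("bob", "alice")}    # bob is a brother of alice (hypothetical data)
P = {("alice", "carol")}  # alice is a parent of carol
U = compose(B, P)         # {('bob', 'carol')}: bob is an uncle of carol
```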

### Notational variations

The semicolon as an infix notation for composition of relations dates back to Ernst Schröder's textbook of 1895.[6] Gunther Schmidt has renewed the use of the semicolon, particularly in Relational Mathematics (2011).[2]: 40 [7] The use of the semicolon coincides with the notation for function composition used (mostly by computer scientists) in category theory,[8] as well as the notation for dynamic conjunction within linguistic dynamic semantics.[9]

A small circle ${\displaystyle (R\circ S)}$ has been used for the infix notation of composition of relations by John M. Howie in his books considering semigroups of relations.[10] However, the small circle is widely used to represent composition of functions ${\displaystyle g(f(x))=(g\circ f)(x),}$ which reverses the text sequence from the operation sequence. The small circle was used in the introductory pages of Relations and Graphs[5]: 18  until it was dropped in favor of juxtaposition (no infix notation). Juxtaposition ${\displaystyle (RS)}$ is commonly used in algebra to signify multiplication, and so it can likewise signify relative multiplication.

Further with the circle notation, subscripts may be used. Some authors[11] prefer to write ${\displaystyle \circ _{l}}$ and ${\displaystyle \circ _{r}}$ explicitly when necessary, depending on whether the left or the right relation is the first one applied. A further variation encountered in computer science is the Z notation: ${\displaystyle \circ }$ is used to denote the traditional (right) composition, while left composition is denoted by a fat semicolon. The Unicode symbols are ⨾ and ⨟.[12][13]

### Mathematical generalizations

Binary relations ${\displaystyle R\subseteq X\times Y}$ are morphisms ${\displaystyle R:X\to Y}$ in the category ${\displaystyle {\mathsf {Rel}}}$. In Rel the objects are sets, the morphisms are binary relations and the composition of morphisms is exactly composition of relations as defined above. The category Set of sets and functions is a subcategory of ${\displaystyle {\mathsf {Rel}}}$ where the maps ${\displaystyle X\to Y}$ are functions ${\displaystyle f:X\to Y}$.

Given a regular category ${\displaystyle \mathbb {X} }$, its category of internal relations ${\displaystyle {\mathsf {Rel}}(\mathbb {X} )}$ has the same objects as ${\displaystyle \mathbb {X} }$, but now the morphisms ${\displaystyle X\to Y}$ are given by subobjects ${\displaystyle R\subseteq X\times Y}$ in ${\displaystyle \mathbb {X} }$.[14] Formally, these are jointly monic spans between ${\displaystyle X}$ and ${\displaystyle Y}$. Categories of internal relations are allegories. In particular ${\displaystyle {\mathsf {Rel}}({\mathsf {Set}})\cong {\mathsf {Rel}}}$. Given a field ${\displaystyle k}$ (or more generally a principal ideal domain), the category of relations internal to matrices over ${\displaystyle k}$, ${\displaystyle {\mathsf {Rel}}({\mathsf {Mat}}(k))}$, has as morphisms ${\displaystyle n\to m}$ the linear subspaces ${\displaystyle R\subseteq k^{n}\oplus k^{m}}$. The category of linear relations over the finite field ${\displaystyle \mathbb {F} _{2}}$ is isomorphic to the phase-free qubit ZX-calculus modulo scalars.

## Properties

• Composition of relations is associative: ${\displaystyle R;(S;T)=(R;S);T.}$
• The converse relation of ${\displaystyle R\,;S}$ is ${\displaystyle (R\,;S)^{\textsf {T}}=S^{\textsf {T}}\,;R^{\textsf {T}}.}$ This property makes the set of all binary relations on a set a semigroup with involution.
• The composition of (partial) functions (that is, functional relations) is again a (partial) function.
• If ${\displaystyle R}$ and ${\displaystyle S}$ are injective, then ${\displaystyle R\,;S}$ is injective, which conversely implies only the injectivity of ${\displaystyle R.}$
• If ${\displaystyle R}$ and ${\displaystyle S}$ are surjective, then ${\displaystyle R\,;S}$ is surjective, which conversely implies only the surjectivity of ${\displaystyle S.}$
• The set of binary relations on a set ${\displaystyle X}$ (that is, relations from ${\displaystyle X}$ to ${\displaystyle X}$) together with (left or right) relation composition forms a monoid with zero, where the identity map on ${\displaystyle X}$ is the neutral element, and the empty set is the zero element.
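The associativity and converse properties above can be checked mechanically on small examples; a brute-force Python sketch with arbitrarily chosen relations (not taken from the text):

```python
# Relations as sets of pairs; compose and converse as defined in the article.
def compose(R, S):
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

# Arbitrary small relations for the check.
R = {(1, 2), (2, 2)}
S = {(2, 3)}
T = {(3, 1), (3, 2)}

# Associativity: R ; (S ; T) == (R ; S) ; T
assert compose(R, compose(S, T)) == compose(compose(R, S), T)
# Converse of a composition: (R ; S)^T == S^T ; R^T
assert converse(compose(R, S)) == compose(converse(S), converse(R))
```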

## Composition in terms of matrices

Finite binary relations are represented by logical matrices. The entries of these matrices are either zero or one, depending on whether the relation represented is false or true for the row and column corresponding to compared objects. Working with such matrices involves the Boolean arithmetic with ${\displaystyle 1+1=1}$ and ${\displaystyle 1\times 1=1.}$ An entry in the matrix product of two logical matrices will be 1, then, only if the row and column multiplied have a corresponding 1. Thus the logical matrix of a composition of relations can be found by computing the matrix product of the matrices representing the factors of the composition. "Matrices constitute a method for computing the conclusions traditionally drawn by means of hypothetical syllogisms and sorites."[15]
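A minimal Python sketch of this Boolean matrix product (the helper name and sample matrices are ours, not from the text):

```python
def bool_matmul(A, B):
    """Matrix product over Boolean arithmetic: 1 + 1 = 1 and 1 * 1 = 1.
    Entry (i, j) is 1 iff row i of A and column j of B share a 1."""
    return [[int(any(A[i][k] and B[k][j] for k in range(len(B))))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# Example: composing two small relations via their logical matrices.
A = [[1, 0],
     [0, 1]]
B = [[0, 1],
     [1, 0]]
print(bool_matmul(A, B))   # [[0, 1], [1, 0]]
```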

## Heterogeneous relations

Consider a heterogeneous relation ${\displaystyle R\subseteq A\times B;}$ that is, where ${\displaystyle A}$ and ${\displaystyle B}$ are distinct sets. Then using composition of relation ${\displaystyle R}$ with its converse ${\displaystyle R^{\textsf {T}},}$ there are homogeneous relations ${\displaystyle RR^{\textsf {T}}}$ (on ${\displaystyle A}$) and ${\displaystyle R^{\textsf {T}}R}$ (on ${\displaystyle B}$).

If for all ${\displaystyle x\in A}$ there exists some ${\displaystyle y\in B}$ such that ${\displaystyle xRy}$ (that is, ${\displaystyle R}$ is a (left-)total relation), then ${\displaystyle xRR^{\textsf {T}}x}$ for all ${\displaystyle x,}$ so that ${\displaystyle RR^{\textsf {T}}}$ is a reflexive relation, or ${\displaystyle I\subseteq RR^{\textsf {T}},}$ where ${\displaystyle I}$ is the identity relation ${\displaystyle \{(x,x):x\in A\}.}$ Similarly, if ${\displaystyle R}$ is a surjective relation then ${\displaystyle R^{\textsf {T}}R\supseteq I=\{(x,x):x\in B\}.}$ In this case ${\displaystyle R\subseteq RR^{\textsf {T}}R.}$ The opposite inclusion occurs for a difunctional relation.

The composition ${\displaystyle {\bar {R}}^{\textsf {T}}R}$ is used to distinguish relations of Ferrers type, which satisfy ${\displaystyle R{\bar {R}}^{\textsf {T}}R=R.}$

### Example

Let ${\displaystyle A=}$ { France, Germany, Italy, Switzerland } and ${\displaystyle B=}$ { French, German, Italian } with the relation ${\displaystyle R}$ given by ${\displaystyle aRb}$ when ${\displaystyle b}$ is a national language of ${\displaystyle a.}$ Since both ${\displaystyle A}$ and ${\displaystyle B}$ are finite, ${\displaystyle R}$ can be represented by a logical matrix, assuming rows (top to bottom) and columns (left to right) are ordered alphabetically: ${\displaystyle {\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\\1&1&1\end{pmatrix}}.}$

The converse relation ${\displaystyle R^{\textsf {T}}}$ corresponds to the transposed matrix, and the relation composition ${\displaystyle R^{\textsf {T}};R}$ corresponds to the matrix product ${\displaystyle R^{\textsf {T}}R}$ when summation is implemented by logical disjunction. It turns out that the ${\displaystyle 3\times 3}$ matrix ${\displaystyle R^{\textsf {T}}R}$ contains a 1 at every position, while the reversed matrix product computes as: ${\displaystyle RR^{\textsf {T}}={\begin{pmatrix}1&0&0&1\\0&1&0&1\\0&0&1&1\\1&1&1&1\end{pmatrix}}.}$ This matrix is symmetric, and represents a homogeneous relation on ${\displaystyle A.}$

Correspondingly, ${\displaystyle R^{\textsf {T}}\,;R}$ is the universal relation on ${\displaystyle B,}$ hence any two languages share a nation where they both are spoken (in fact: Switzerland). Vice versa, the question whether two given nations share a language can be answered using ${\displaystyle R\,;R^{\textsf {T}}.}$
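This worked example can be reproduced in a few lines of Python using the Boolean matrix product described above (helper names are ours):

```python
# R: rows = France, Germany, Italy, Switzerland; columns = French, German, Italian.
R = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1],
     [1, 1, 1]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def bool_matmul(A, B):
    """Boolean matrix product: 1 + 1 = 1 and 1 * 1 = 1."""
    return [[int(any(A[i][k] and B[k][j] for k in range(len(B))))
             for j in range(len(B[0]))] for i in range(len(A))]

RT = transpose(R)
print(bool_matmul(RT, R))  # 3x3 all-ones: every pair of languages shares a nation
print(bool_matmul(R, RT))  # the symmetric 4x4 matrix of nations sharing a language
```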

## Schröder rules

For a given set ${\displaystyle V,}$ the collection of all binary relations on ${\displaystyle V}$ forms a Boolean lattice ordered by inclusion ${\displaystyle (\subseteq ).}$ Recall that complementation reverses inclusion: ${\displaystyle A\subseteq B{\text{ implies }}B^{\complement }\subseteq A^{\complement }.}$ In the calculus of relations[16] it is common to represent the complement of a set by an overbar: ${\displaystyle {\bar {A}}=A^{\complement }.}$

If ${\displaystyle S}$ is a binary relation, let ${\displaystyle S^{\textsf {T}}}$ represent the converse relation, also called the transpose. Then the Schröder rules are ${\displaystyle QR\subseteq S\quad {\text{ is equivalent to }}\quad Q^{\textsf {T}}{\bar {S}}\subseteq {\bar {R}}\quad {\text{ is equivalent to }}\quad {\bar {S}}R^{\textsf {T}}\subseteq {\bar {Q}}.}$ Verbally, one equivalence can be obtained from another: select the first or second factor and transpose it; then complement the other two relations and permute them.[5]: 15–19
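The three-way equivalence can be verified exhaustively over all relations on a small set; a brute-force Python sketch (not from the article, and checking a 2-element carrier only):

```python
# Verify the Schröder rules QR ⊆ S  iff  Q^T S̄ ⊆ R̄  iff  S̄ R^T ⊆ Q̄
# for every triple of relations on a 2-element set (16^3 = 4096 cases).
from itertools import chain, combinations

V = [0, 1]
pairs = [(x, y) for x in V for y in V]
rels = [frozenset(c) for c in chain.from_iterable(
    combinations(pairs, r) for r in range(len(pairs) + 1))]

full = frozenset(pairs)
comp = lambda R: full - R                              # complement
conv = lambda R: frozenset((y, x) for (x, y) in R)     # converse / transpose
mul = lambda R, S: frozenset((x, z) for (x, y) in R
                             for (y2, z) in S if y == y2)

for Q in rels:
    for R in rels:
        for S in rels:
            a = mul(Q, R) <= S
            b = mul(conv(Q), comp(S)) <= comp(R)
            c = mul(comp(S), conv(R)) <= comp(Q)
            assert a == b == c
print("Schröder rules verified on a 2-element set")
```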

Though this transformation of an inclusion of a composition of relations was detailed by Ernst Schröder, in fact Augustus De Morgan first articulated the transformation as Theorem K in 1860.[4] He wrote[17] ${\displaystyle LM\subseteq N{\text{ implies }}{\bar {N}}M^{\textsf {T}}\subseteq {\bar {L}}.}$

With Schröder rules and complementation one can solve for an unknown relation ${\displaystyle X}$ in relation inclusions such as ${\displaystyle RX\subseteq S\quad {\text{and}}\quad XR\subseteq S.}$ For instance, by Schröder rule ${\displaystyle RX\subseteq S{\text{ implies }}R^{\textsf {T}}{\bar {S}}\subseteq {\bar {X}},}$ and complementation gives ${\displaystyle X\subseteq {\overline {R^{\textsf {T}}{\bar {S}}}},}$ which is called the left residual of ${\displaystyle S}$ by ${\displaystyle R}$.

## Quotients

Just as composition of relations is a type of multiplication resulting in a product, so some operations compare to division and produce quotients. Three quotients are exhibited here: left residual, right residual, and symmetric quotient. The left residual of two relations is defined presuming that they have the same domain (source), and the right residual presumes the same codomain (range, target). The symmetric quotient presumes two relations share a domain and a codomain.

Definitions:

• Left residual: ${\displaystyle A\backslash B\mathrel {:=} {\overline {A^{\textsf {T}}{\bar {B}}}}}$
• Right residual: ${\displaystyle D/C\mathrel {:=} {\overline {{\bar {D}}C^{\textsf {T}}}}}$
• Symmetric quotient: ${\displaystyle \operatorname {syq} (E,F)\mathrel {:=} {\overline {E^{\textsf {T}}{\bar {F}}}}\cap {\overline {{\bar {E}}^{\textsf {T}}F}}}$

Using Schröder's rules, ${\displaystyle AX\subseteq B}$ is equivalent to ${\displaystyle X\subseteq A\backslash B.}$ Thus the left residual is the greatest relation satisfying ${\displaystyle AX\subseteq B.}$ Similarly, the inclusion ${\displaystyle YC\subseteq D}$ is equivalent to ${\displaystyle Y\subseteq D/C,}$ and the right residual is the greatest relation satisfying ${\displaystyle YC\subseteq D.}$[2]: 43–6
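The greatest-relation characterization of the left residual can be checked exhaustively on a small carrier; a brute-force Python sketch (our own illustration, checking a 2-element set only):

```python
# Check: A ; X ⊆ B  iff  X ⊆ A\B, where A\B = complement(A^T ; complement(B)),
# for every pair A, B and every candidate X on a 2-element set.
from itertools import chain, combinations

V = [0, 1]
pairs = [(x, y) for x in V for y in V]
rels = [frozenset(c) for c in chain.from_iterable(
    combinations(pairs, r) for r in range(len(pairs) + 1))]

full = frozenset(pairs)
comp = lambda R: full - R
conv = lambda R: frozenset((y, x) for (x, y) in R)
mul = lambda R, S: frozenset((x, z) for (x, y) in R
                             for (y2, z) in S if y == y2)

def left_residual(A, B):
    return comp(mul(conv(A), comp(B)))

for A in rels:
    for B in rels:
        for X in rels:
            assert (mul(A, X) <= B) == (X <= left_residual(A, B))
print("left residual is the greatest X with A ; X contained in B")
```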

One can practice the logic of residuals with Sudoku.

## Join: another form of composition

A fork operator ${\displaystyle (<)}$ has been introduced to fuse two relations ${\displaystyle c:H\to A}$ and ${\displaystyle d:H\to B}$ into ${\displaystyle c\,(<)\,d:H\to A\times B.}$ The construction depends on projections ${\displaystyle a:A\times B\to A}$ and ${\displaystyle b:A\times B\to B,}$ understood as relations, meaning that there are converse relations ${\displaystyle a^{\textsf {T}}}$ and ${\displaystyle b^{\textsf {T}}.}$ Then the fork of ${\displaystyle c}$ and ${\displaystyle d}$ is given by[18] ${\displaystyle c\,(<)\,d~\mathrel {:=} ~c;a^{\textsf {T}}\cap \ d;b^{\textsf {T}}.}$

Another form of composition of relations, which applies to general ${\displaystyle n}$-place relations for ${\displaystyle n\geq 2,}$ is the join operation of relational algebra. The usual composition of two binary relations as defined here can be obtained by taking their join, leading to a ternary relation, followed by a projection that removes the middle component. For example, the query language SQL provides the JOIN operation for this purpose.

## Notes

1. ^ Bjarni Jónsson (1984) "Maximal Algebras of Binary Relations", in Contributions to Group Theory, K.I. Appel editor, American Mathematical Society ISBN 978-0-8218-5035-0
2. ^ a b c Gunther Schmidt (2011) Relational Mathematics, Encyclopedia of Mathematics and its Applications, vol. 132, Cambridge University Press ISBN 978-0-521-76268-7
3. ^ A. De Morgan (1860) "On the Syllogism: IV and on the Logic of Relations"
4. ^ a b Daniel D. Merrill (1990) Augustus De Morgan and the Logic of Relations, page 121, Kluwer Academic ISBN 9789400920477
5. ^ a b c Gunther Schmidt & Thomas Ströhlein (1993) Relations and Graphs, Springer books
6. ^
7. ^ Paul Taylor (1999). Practical Foundations of Mathematics. Cambridge University Press. p. 24. ISBN 978-0-521-63107-5. A free HTML version of the book is available at http://www.cs.man.ac.uk/~pt/Practical_Foundations/
8. ^ Michael Barr & Charles Wells (1998) Category Theory for Computer Scientists Archived 2016-03-04 at the Wayback Machine, page 6, from McGill University
9. ^ Rick Nouwen and others (2016) Dynamic Semantics §2.2, from Stanford Encyclopedia of Philosophy
10. ^ John M. Howie (1995) Fundamentals of Semigroup Theory, page 16, LMS Monograph #12, Clarendon Press ISBN 0-19-851194-9
11. ^ Kilp, Knauer & Mikhalev, p. 7
12. ^ ISO/IEC 13568:2002(E), p. 23
13. ^ See U+2A3E and U+2A1F on FileFormat.info
14. ^ "internal relations". nlab. Retrieved 26 September 2023.
15. ^ Irving Copilowish (December 1948) "Matrix development of the calculus of relations", Journal of Symbolic Logic 13(4): 193–203 Jstor link, quote from page 203
16. ^
17. ^ De Morgan indicated contraries by lower case, conversion as M−1, and inclusion with )), so his notation was ${\displaystyle nM^{-1}))\ l.}$
18. ^ Gunther Schmidt and Michael Winter (2018): Relational Topology, page 26, Lecture Notes in Mathematics vol. 2208, Springer books, ISBN 978-3-319-74451-3

## References

• M. Kilp, U. Knauer, A.V. Mikhalev (2000) Monoids, Acts and Categories with Applications to Wreath Products and Graphs, De Gruyter Expositions in Mathematics vol. 29, Walter de Gruyter, ISBN 3-11-015248-7.