It is natural to look for a class of structural equation models with favorable identifiability properties.
A $d$-dimensional rooted tree linear cascade is a sequence of four objects: a tree on $\set{1, \dots , d}$, a vertex of the tree, a family of real numbers indexed by the edges of the tree, and a $d$-dimensional random vector whose covariance matrix is the identity matrix. The cascade is called “$d$-dimensional” because we associate with it a random vector, defined in terms of these four objects, whose codomain is $\R ^d$.
The tree together with the vertex forms a rooted tree, and the tree together with the family of real numbers forms a weighted graph.
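For concreteness, here is a minimal instance of the definition (the particular tree, root, and weights are illustrative choices, not taken from the text): with $d = 3$, take the path
\[
T = \set{\set{1, 2}, \set{2, 3}}, \qquad i = 1, \qquad w_{\set{1,2}} = \tfrac{1}{2}, \qquad w_{\set{2,3}} = -\tfrac{1}{3},
\]
and any $3$-dimensional random vector $e$ with identity covariance; then $(T, 1, w, e)$ is a $3$-dimensional rooted tree linear cascade.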
The idea is to use the weights and the tree
structure to recursively define a random vector
in terms of elements of the given random vector.
Let $C = (T, i, w, e)$ be a $d$-dimensional
rooted tree linear cascade.
So $T$ is a tree on $\set{1, \dots , d}$, $i
\in \set{1, \dots , d}$ and $w: T \to \R $,
and $e: \Omega \to \R ^d$ for some probability
space $(\Omega, \mathcal{A} , \mathbfsf{P} )$.
The random vector associated with $C$ is the
map $x: \Omega \to \R ^d$ defined
by
\[
x_i = e_i \text{ and } x_j = w_{\set{\pa{j}, j}}x_{\pa{j}} +
e_{j} \text{ for } j \neq i.
\]
Equivalently, in matrix form,
\[
e = Ax,
\]
where $A$ is the $d \times d$ matrix with $A_{jj} = 1$ for all $j$, $A_{j, \pa{j}} = -w_{\set{\pa{j}, j}}$ for $j \neq i$, and all other entries zero.
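The recursion and the matrix form can be sketched in code as follows. This is a minimal sketch: the particular tree, root, and weights are illustrative choices, not taken from the text, and vertices are 0-based in code.

```python
import numpy as np

# Illustrative cascade (not from the text): the path 0 - 1 - 2 rooted at 0.
d = 3
root = 0
parent = {1: 0, 2: 1}              # pa(j) for each non-root vertex j
w = {(0, 1): 0.5, (1, 2): -0.3}    # weight on each edge {pa(j), j}

rng = np.random.default_rng(0)
e = rng.standard_normal(d)          # noise; identity covariance by assumption

# Recursive definition: x_root = e_root and x_j = w_{pa(j), j} x_{pa(j)} + e_j.
x = np.empty(d)
x[root] = e[root]
for j in sorted(parent):            # visits parents before children on a path
    x[j] = w[(parent[j], j)] * x[parent[j]] + e[j]

# Matrix form e = A x, with A = I - W and W[j, pa(j)] = w_{pa(j), j}.
W = np.zeros((d, d))
for j, p in parent.items():
    W[j, p] = w[(p, j)]
A = np.eye(d) - W
assert np.allclose(A @ x, e)
```

The assertion at the end checks that the recursively built $x$ satisfies the matrix equation $e = Ax$ for this example.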
Let $(\Omega, \mathcal{A} , \mathbfsf{P} )$ be a probability space, let $e: \Omega \to \R ^d$ be a random vector, and let $T$ be a tree on $\set{1, \dots , d}$ with weight $w_{ij} = w_{ji}$ on each edge $\set{i, j} \in T$.