Let $\mathbf{U}$ be a random unitary matrix and $\mathbf{z}$ a complex Gaussian vector with i.i.d. entries (so its distribution is unitarily invariant). Assume the following relation holds: \begin{align} \mathbf{y}=\mathbf{U}\mathbf{s}+\mathbf{z}. \end{align} Do the following relations hold for the mutual information between $\mathbf{s}$ and $\mathbf{y}$? \begin{align} I(\mathbf{s};\mathbf{y})&=I(\mathbf{s};\mathbf{U}^{\mathrm{H}}\mathbf{y})\\ &=I(\mathbf{s};\mathbf{s}+\mathbf{U}^{\mathrm{H}}\mathbf{z})\\ &=I(\mathbf{s};\mathbf{s}+\mathbf{z})\\ &=h(\mathbf{s}+\mathbf{z})-h(\mathbf{z}), \end{align} where $(\cdot)^{\mathrm{H}}$ denotes the Hermitian (conjugate-transpose) operator.

Comment (Math_Y, May 13 at 23:40): This relationship is clearly true for deterministic $\mathbf{U}$.
No, this is not correct. As a counterexample, suppose $s$ can take only two values, a fixed unit vector $e$ or $-e$. Since $U$ is a random, unknown unitary, $Us=\pm Ue$ has the same distribution for either sign, so $y=Us+z$ is statistically independent of $s$ and the mutual information $I(s;y)=0$.
On the other hand, knowledge of $s$ does give you information about the sum $s+z=\pm e+z$, so $I(s;s+z)\neq 0$, contradicting the third equality in the OP.
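A quick Monte Carlo check of this counterexample (a sketch, assuming entries of $z$ are standard complex Gaussian and $U$ is Haar-distributed): the conditional mean $\mathbb{E}[y\mid s]=\mathbb{E}[U]s$ vanishes for either sign of $s$, while $\mathbb{E}[s+z\mid s]=s$ does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 4, 20000

def haar_unitary(n, rng):
    # QR of a complex Gaussian matrix gives a Haar-random unitary once
    # the phases of R's diagonal are absorbed into Q's columns.
    g = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(g)
    d = np.diag(r)
    return q * (d / np.abs(d))

e = np.zeros(n, dtype=complex)
e[0] = 1.0

means = {}
for sign in (+1, -1):
    s = sign * e
    y_samples = np.empty((trials, n), dtype=complex)
    sz_samples = np.empty((trials, n), dtype=complex)
    for t in range(trials):
        U = haar_unitary(n, rng)
        z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
        y_samples[t] = U @ s + z   # receiver sees y, but not U
        sz_samples[t] = s + z      # hypothetical observation s + z
    means[sign] = (y_samples.mean(axis=0), sz_samples.mean(axis=0))

# E[y | s] = E[U]s = 0 for either sign: y carries no first-order trace of s.
print(np.linalg.norm(means[+1][0]), np.linalg.norm(means[-1][0]))  # both near 0
# E[s+z | s] = s: the sum clearly depends on s.
print(np.linalg.norm(means[+1][1] - e))  # near 0, i.e. the mean is +e
```

Of course, a matching first moment alone does not prove independence, but here the full distribution of $Us$ is uniform on the sphere for either sign, so the check is consistent with $I(s;y)=0$.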
The error appears already in the first equality: $I(s;y)=I(s;U_0^{\mathrm{H}}y)$ holds for any fixed unitary $U_0$, since applying an invertible map to $y$ preserves mutual information, but that $U_0$ is not the same unitary as in the construction $y=Us+z$, because the $U$ there is random and unknown to the receiver.
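By contrast, when the unitary is a deterministic $\mathbf{U}_0$ known at the receiver (as in the comment above), the chain does go through, using the unitary invariance of $\mathbf{z}$:
\begin{align}
I(\mathbf{s};\mathbf{y}) &= I(\mathbf{s};\mathbf{U}_0^{\mathrm{H}}\mathbf{y}) && \text{(invertible map of } \mathbf{y}\text{)}\\
&= I(\mathbf{s};\mathbf{s}+\mathbf{U}_0^{\mathrm{H}}\mathbf{z})\\
&= I(\mathbf{s};\mathbf{s}+\mathbf{z}) && \big(\mathbf{U}_0^{\mathrm{H}}\mathbf{z}\overset{d}{=}\mathbf{z}\big)\\
&= h(\mathbf{s}+\mathbf{z}) - h(\mathbf{s}+\mathbf{z}\mid\mathbf{s}) = h(\mathbf{s}+\mathbf{z}) - h(\mathbf{z}),
\end{align}
where the last step uses the independence of $\mathbf{s}$ and $\mathbf{z}$.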