Region-optimized virtual (ROVir) coils

Author: Zimu Huo
Date: 01.2025

References

[1]
Author: Daeun Kim et al.
Title: Region-optimized virtual (ROVir) coils: Localization and/or suppression of spatial regions using sensor-domain beamforming
Link: https://onlinelibrary.wiley.com/doi/full/10.1002/mrm.28706
In [120]:
import sys
sys.path.insert(1, '../')
import numpy as np
import matplotlib.pyplot as plt
from util.coil import *
import util.mask as undersample
from util.fft import *
import util.simulator as simulate
from util.grappa import *
import util.phantom as phantom
In [121]:
# load fully sampled multi-coil k-space data (ny x nx x nc)
slice1 = np.load("../lib/slice1_grappa1.npy")
[ny, nx, nc] = slice1.shape
data = slice1
# reconstruct the individual coil images
rawImage = ifft2c(data)
# extract a 32 x 32 autocalibration region (not used in this example)
acs = simulate.acs(data, (32, 32))
In [122]:
# fully sampled data
show(rawImage,4)
In [123]:
# Amask selects the signal region of interest (region A)
Amask = np.zeros([ny, nx, nc], dtype = complex)
Amask[64:150, 64:150, :] = 1
# Bmask selects the interference region to suppress (region B)
Bmask = np.zeros([ny, nx, nc], dtype = complex)
Bmask[:, 150:, :] = 1
In [124]:
# restrict the coil images to the signal region and form the
# nc x nc signal correlation matrix A
ga = (Amask * rawImage).reshape(-1, nc)
A = ga.conj().T @ ga
In [125]:
# restrict the coil images to the interference region and form the
# nc x nc interference correlation matrix B
gb = (Bmask * rawImage).reshape(-1, nc)
B = gb.conj().T @ gb
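
Both A and B are Gram matrices, so they are Hermitian and positive semidefinite by construction. The quick check below (a minimal sketch reusing A and B from the cells above) also confirms that B is numerically positive definite, which the generalized eigensolver used later expects.
In [ ]:
# A and B are Gram matrices of the form G^H G, hence Hermitian positive semidefinite
print(np.allclose(A, A.conj().T), np.allclose(B, B.conj().T))
# the smallest eigenvalue of B should be positive (B full rank), so that the
# generalized eigenvalue problem A w = lambda B w solved below is well posed
print(np.min(np.linalg.eigvalsh(B)) > 0)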

Derivation of Equation (13): Generalized Eigenvalue Equation

To derive the generalized eigenvalue equation $\mathbf{A} \mathbf{w}_j = \lambda_j \mathbf{B} \mathbf{w}_j$ (Equation (13) in [1]), follow these steps:


1. Start with the SIR Metric

The signal-to-interference ratio (SIR) metric to maximize for each virtual coil is:

$$ \text{SIR}(\mathbf{w}_j) = \frac{\mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j}{\mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j}. $$

Here:

  • $\mathbf{w}_j$: complex weight vector for the $j$-th virtual coil.
  • $\mathbf{A}$: signal correlation matrix, formed from the data restricted to the region of interest.
  • $\mathbf{B}$: interference correlation matrix, formed from the data restricted to the region to suppress.
  • $\mathbf{w}_j^\mathrm{H}$: Hermitian (conjugate transpose) of $\mathbf{w}_j$.
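
As a quick numerical illustration of this metric, the sketch below evaluates the SIR of an arbitrary coil combination, reusing A, B, and nc from the cells above (the sir helper and the uniform test weights are only illustrative, not part of the ROVir method itself).
In [ ]:
def sir(w, A, B):
    # SIR(w) = (w^H A w) / (w^H B w); both quadratic forms are real and
    # non-negative because A and B are Hermitian positive semidefinite
    return np.real(w.conj().T @ A @ w) / np.real(w.conj().T @ B @ w)

# example: SIR of a uniform combination of all physical coils
w_uniform = np.ones(nc) / np.sqrt(nc)
print(sir(w_uniform, A, B))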

2. Differentiate the SIR Metric

To maximize $\text{SIR}(\mathbf{w}_j)$, compute the derivative with respect to $\mathbf{w}_j^\mathrm{H}$:

$$ \frac{\partial}{\partial \mathbf{w}_j^\mathrm{H}} \text{SIR}(\mathbf{w}_j) = \frac{\partial}{\partial \mathbf{w}_j^\mathrm{H}} \frac{\mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j}{\mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j}. $$

3. Apply the Quotient Rule

Using the quotient rule for derivatives:

$$ \frac{\partial}{\partial \mathbf{w}_j^\mathrm{H}} \text{SIR}(\mathbf{w}_j) = \frac{ \frac{\partial}{\partial \mathbf{w}_j^\mathrm{H}} \big( \mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j \big) \cdot \big( \mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j \big) - \big( \mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j \big) \cdot \frac{\partial}{\partial \mathbf{w}_j^\mathrm{H}} \big( \mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j \big) }{ \big( \mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j \big)^2}. $$

4. Compute the Derivatives in the Numerator

Let’s compute each term:

  • First Term: $$ \frac{\partial}{\partial \mathbf{w}_j^\mathrm{H}} \big( \mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j \big) = \mathbf{A} \mathbf{w}_j. $$

  • Second Term: $$ \frac{\partial}{\partial \mathbf{w}_j^\mathrm{H}} \big( \mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j \big) = \mathbf{B} \mathbf{w}_j. $$

(Here $\mathbf{w}_j$ and $\mathbf{w}_j^\mathrm{H}$ are treated as independent variables, as in Wirtinger calculus, so no factor of 2 appears.)

Substitute into the numerator:

$$ \frac{\partial}{\partial \mathbf{w}_j^\mathrm{H}} \text{SIR}(\mathbf{w}_j) = \frac{ \mathbf{A} \mathbf{w}_j (\mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j) - \mathbf{B} \mathbf{w}_j (\mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j) }{ \big( \mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j \big)^2}. $$

5. Set the Derivative to Zero

At the maximum SIR, the derivative must vanish:

$$ \mathbf{A} \mathbf{w}_j (\mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j) - \mathbf{B} \mathbf{w}_j (\mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j) = 0. $$

Rearrange:

$$ \mathbf{A} \mathbf{w}_j (\mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j) = \mathbf{B} \mathbf{w}_j (\mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j). $$

6. Divide Through by $\mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j$

Since $\mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j$ is a (positive) scalar, dividing both sides by it gives:

$$ \mathbf{A} \mathbf{w}_j = \lambda_j \mathbf{B} \mathbf{w}_j, $$

where:

$$ \lambda_j = \frac{\mathbf{w}_j^\mathrm{H} \mathbf{A} \mathbf{w}_j}{\mathbf{w}_j^\mathrm{H} \mathbf{B} \mathbf{w}_j}. $$

Final Result

This is the generalized eigenvalue equation, where:

  • $\mathbf{w}_j$: generalized eigenvectors (the optimal combination weights).
  • $\lambda_j$: generalized eigenvalues (the SIR attained by each $\mathbf{w}_j$; the largest eigenvalue is the maximum achievable SIR).
In [132]:
import numpy as np
from scipy.linalg import eigh

# Solve the generalized eigenvalue problem
# eigh solves A w = λ B w for Hermitian matrices A and B
eigvals, eigvecs = eigh(A, B)

# eigvals contains the eigenvalues (λ_j)
# eigvecs contains the eigenvectors (w_j), each column corresponds to an eigenvector

sorted_indices = np.argsort(eigvals)[::-1]  # Indices of sorted eigenvalues in descending order
eigvals = eigvals[sorted_indices]          # Sort eigenvalues
eigvecs = eigvecs[:, sorted_indices]       # Sort eigenvectors accordingly

# Keep the top nv eigenvalues and eigenvectors (these define the ROVir virtual coils)
nv = 5  # Change this to keep more or fewer virtual coils
top_eigvals = eigvals[:nv]                  # Top nv eigenvalues
top_eigvecs = eigvecs[:, :nv]               # Corresponding eigenvectors
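
As a sanity check on the derivation above, each eigenvalue returned by the solver should equal the SIR attained by its eigenvector, and the leading pair should satisfy the generalized eigenvalue equation (a minimal sketch reusing A, B, eigvals, and eigvecs from the cell above).
In [ ]:
# the top eigenvector and its eigenvalue
w1 = eigvecs[:, 0]
sir_w1 = np.real(w1.conj().T @ A @ w1) / np.real(w1.conj().T @ B @ w1)
print(sir_w1, eigvals[0])                           # the two values should agree
print(np.allclose(A @ w1, eigvals[0] * (B @ w1)))   # expected: True (up to precision)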
In [133]:
plt.plot(eigvals)
(Plot: generalized eigenvalues sorted in descending order.)
In [134]:
# beamforming weights: one column per ROVir virtual coil
w = top_eigvecs
In [135]:
# flatten the coil images and project them onto the virtual coils
sf = rawImage.reshape(-1, nc)
wsf = sf @ w
In [136]:
# reshape into ny x nx x nv virtual-coil images
img = wsf.reshape(ny, nx, nv)
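
Because the Fourier transform is linear, the same weights can be applied directly in the sensor (k-space) domain, which is the setting described in [1]; the resulting virtual-coil images should be identical. Below is a minimal check, reusing data, w, ny, nx, and nv from above.
In [ ]:
# apply the beamforming weights to the k-space data instead of the coil images
kspace_vc = (data.reshape(-1, nc) @ w).reshape(ny, nx, nv)
print(np.allclose(ifft2c(kspace_vc), img))   # expected: True (linearity of the FFT)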
In [145]:
plt.figure(figsize=(16, 12))
plt.subplot(141)
plt.title("fully sampled")
plt.imshow(rsos(rawImage), cmap = "gray")
plt.subplot(142)
plt.title("Signal mask")
plt.imshow(rsos(Amask), cmap = "gray")
plt.subplot(143)
plt.title("Interference mask")
plt.imshow(rsos(Bmask), cmap = "gray")
plt.subplot(144)
plt.title("After beam forming")
plt.imshow(rsos(img), cmap = "gray")
(Figure: fully sampled image, signal mask, interference mask, and the image after ROVir beamforming.)
In [146]:
# sanity check: signal-to-interference energy ratio before and after beamforming

# ratio for the original coil images
print(np.sum(rsos(rawImage) * rsos(Amask))/ np.sum(rsos(rawImage) * rsos(Bmask)))

# ratio for the ROVir virtual-coil images
print(np.sum(rsos(img) * rsos(Amask))/ np.sum(rsos(img) * rsos(Bmask)))
0.48775399183141427
2.521213956670469