#### On a Vectorized Version of a Generalized Richardson Extrapolation Process

##### Avram Sidi

Let $\{\xx_m\}$ be a vector sequence that satisfies $$\xx_m\sim \sss+\sum^\infty_{i=1}\alpha_i \gg_i(m)\quad\text{as } m\to\infty,$$ $\sss$ being the limit or antilimit of $\{\xx_m\}$ and $\{\gg_i(m)\}^\infty_{i=1}$ being an asymptotic scale as $m\to\infty$, in the sense that $$\lim_{m\to\infty}\frac{\|\gg_{i+1}(m)\|}{\|\gg_{i}(m)\|}=0,\quad i=1,2,\ldots.$$ The vector sequences $\{\gg_i(m)\}^\infty_{m=0}$, $i=1,2,\ldots,$ are known, as is $\{\xx_m\}$. In this work, we analyze the convergence and convergence-acceleration properties of a vectorized version of the generalized Richardson extrapolation process that is defined via the equations $$\sum^k_{i=1}\braket{\yy,\Delta\gg_{i}(m)}\widetilde{\alpha}_i=\braket{\yy,\Delta\xx_m},\quad n\leq m\leq n+k-1;\quad \sss_{n,k}=\xx_n-\sum^k_{i=1}\widetilde{\alpha}_i\gg_{i}(n),$$ $\sss_{n,k}$ being the approximation to $\sss$. Here $\yy$ is some nonzero vector, $\braket{\cdot\,,\cdot}$ is an inner product such that $\braket{\alpha\aaa,\beta\bb}=\bar{\alpha}\beta\braket{\aaa,\bb}$, and $\Delta\xx_m=\xx_{m+1}-\xx_m$ and $\Delta\gg_i(m)=\gg_i(m+1)-\gg_i(m)$. By imposing a minimal number of reasonable additional conditions on the $\gg_i(m)$, we show that the error $\sss_{n,k}-\sss$ has a full asymptotic expansion as $n\to\infty$. We also show that actual convergence acceleration takes place, and we provide a complete classification of it.
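As a concrete illustration of the defining equations, the following NumPy sketch (all names and the data layout are illustrative, not from the paper) assembles the $k\times k$ linear system from the inner products $\braket{\yy,\Delta\gg_i(m)}$ and $\braket{\yy,\Delta\xx_m}$, solves for the $\widetilde{\alpha}_i$, and forms $\sss_{n,k}$ by subtracting the fitted correction, consistent with the expansion $\xx_m\sim\sss+\sum_i\alpha_i\gg_i(m)$:

```python
import numpy as np

def gen_richardson_vec(x, g, y, n, k):
    """Sketch of the vectorized generalized Richardson extrapolation process.

    x : (M, d) array    -- the vectors x_0, ..., x_{M-1}
    g : (K, M, d) array -- the known sequences g_i(m), i = 1, ..., K (0-based here)
    y : (d,) array      -- the fixed nonzero vector in the inner products
    Returns the approximation s_{n,k} to the limit/antilimit s.
    """
    dx = np.diff(x, axis=0)   # forward differences: Delta x_m = x_{m+1} - x_m
    dg = np.diff(g, axis=1)   # Delta g_i(m) = g_i(m+1) - g_i(m)
    # k x k system: sum_i <y, Delta g_i(m)> alpha_i = <y, Delta x_m>,
    # one equation for each m = n, ..., n+k-1.
    A = np.empty((k, k))
    b = np.empty(k)
    for row, m in enumerate(range(n, n + k)):
        for i in range(k):
            A[row, i] = y @ dg[i, m]
        b[row] = y @ dx[m]
    alpha = np.linalg.solve(A, b)
    # Subtract the fitted correction, since x_n ~ s + sum_i alpha_i g_i(n).
    return x[n] - np.tensordot(alpha, g[:k, n], axes=1)
```

On a synthetic sequence whose expansion terminates after two terms (so that $\xx_m$ lies exactly in the span of $\sss$, $\gg_1(m)$, $\gg_2(m)$), $\sss_{n,2}$ reproduces the limit $\sss$ to machine precision.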
