Abstract: For a double complex $(A, d', d'')$, we show that if it satisfies the $d'd''$-lemma and the spectral sequence $\{E^{p,q}_r\}$ induced by $A$ does not degenerate at $E_0$, then it degenerates at $E_1$. We apply this result to prove the degeneration at $E_1$ of a Hodge-de Rham spectral sequence on compact bi-generalized Hermitian manifolds that satisfy a version of the $d'd''$-lemma.
Keywords: $\partial\overline{\partial}$-lemma; Hodge-de Rham spectral sequence; $E_1$-degeneration; bi-generalized Hermitian manifold
DOI: 10.14712/1213-7243.2015.156
AMS Subject Classification: 55T05, 53C05
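As a point of terminology (this is the standard definition, not a result of this paper): a spectral sequence $\{E_r^{p,q}, d_r\}$ is said to degenerate at $E_{r_0}$ if every differential on and after page $r_0$ vanishes, so that the pages stabilize:
\[
  d_r = 0 \ \text{for all } r \ge r_0
  \quad \Longrightarrow \quad
  E_{r_0}^{p,q} \cong E_{r_0+1}^{p,q} \cong \cdots \cong E_\infty^{p,q}.
\]
In particular, degeneration at $E_1$ means the first page already computes the limit term $E_\infty^{p,q}$.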