Intervention studies in psychology often focus on identifying mechanisms that explain change over time. Cross-lagged panel models (CLPMs) are well suited to studying such mechanisms, but there is controversy regarding the importance of detrending (defined here as separating longer-term time trends from cross-lagged effects) when modeling these change processes. The aim of this study was to present and test the arguments for and against detrending CLPMs in the presence of an intervention effect. We conducted Monte Carlo simulations to examine the impact of trends on estimates of cross-lagged effects from several longitudinal structural equation models. Our simulations suggested that ignoring time trends led to biased estimates of auto- and cross-lagged effects in some conditions, whereas detrending did not introduce bias in any of the models. We used real data from an intervention study to illustrate how detrending may affect results. In this example, models that separated trends from cross-lagged effects fit the data better and showed a nonsignificant effect of the mechanism on the outcome, whereas models that ignored trends showed significant effects. We conclude that ignoring trends increases the risk of bias in estimates of auto- and cross-lagged parameters and may lead to spurious findings. Researchers can test for the presence of trends by comparing the fit of models that account for individual differences in trends (e.g., the autoregressive latent trajectory model, the latent curve model with structured residuals, or the general cross-lagged model) with the fit of models that do not.
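To make the bias mechanism concrete, the following sketch (not the authors' simulation design) generates bivariate panel data in which x and y share correlated person-specific linear trends but have a true cross-lagged effect of zero, and then estimates the cross-lagged effect with a simple pooled lagged regression, first on the raw scores and then after removing a linear trend within each person. The data-generating process, all parameter values, and the regression-based estimator are illustrative assumptions; the study itself used longitudinal structural equation models.

```python
import numpy as np

rng = np.random.default_rng(2024)

# Illustrative (hypothetical) settings -- not the values used in the paper.
N, T = 500, 6        # persons and time points
rho = 0.3            # true autoregressive effect in the detrended dynamics
beta_xy = 0.0        # true cross-lagged effect of x on y (none)

# Person-specific intercepts (independent across x and y) and linear trends.
# The trends in x and y are correlated, e.g. because an intervention shifts
# both variables upward over time.
intercepts = rng.normal(0.0, 1.0, size=(N, 2))
slopes = rng.multivariate_normal([0.4, 0.4], [[0.04, 0.03], [0.03, 0.04]], size=N)

# Detrended (residual) dynamics: AR(1) in both variables, zero cross-lagged effect.
u = np.zeros((N, T))
v = np.zeros((N, T))
u[:, 0] = rng.normal(0.0, 1.0, size=N)
v[:, 0] = rng.normal(0.0, 1.0, size=N)
for t in range(1, T):
    u[:, t] = rho * u[:, t - 1] + rng.normal(0.0, 1.0, size=N)
    v[:, t] = rho * v[:, t - 1] + beta_xy * u[:, t - 1] + rng.normal(0.0, 1.0, size=N)

time = np.arange(T)
x = intercepts[:, [0]] + slopes[:, [0]] * time + u   # observed score = trend + residual
y = intercepts[:, [1]] + slopes[:, [1]] * time + v


def crosslag_estimate(x, y):
    """Pooled OLS of y_t on y_{t-1} and x_{t-1}; returns the slope on x_{t-1}."""
    y_lag, x_lag, y_now = y[:, :-1].ravel(), x[:, :-1].ravel(), y[:, 1:].ravel()
    X = np.column_stack([np.ones_like(y_lag), y_lag, x_lag])
    coef, *_ = np.linalg.lstsq(X, y_now, rcond=None)
    return coef[2]


def detrend(z):
    """Remove a person-specific linear time trend by per-person least squares."""
    t = np.arange(z.shape[1], dtype=float)
    D = np.column_stack([np.ones_like(t), t])
    coefs, *_ = np.linalg.lstsq(D, z.T, rcond=None)   # intercept/slope per person
    return z - (D @ coefs).T


print("true cross-lagged effect:", beta_xy)
print("ignoring trends:         ", round(crosslag_estimate(x, y), 3))
print("after detrending:        ", round(crosslag_estimate(detrend(x), detrend(y)), 3))
```

Under these assumptions, the raw-score regression tends to yield a clearly positive cross-lagged estimate even though the true effect is zero, whereas the estimate from the detrended scores stays close to zero, which parallels the pattern of spurious effects described above.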