As @rnva points out, these are not the same quantity. To see why both are nevertheless referred to as $D_{\min}$, it is best to view them as limiting cases of the $\alpha$-Rényi divergences.
First, we have the sandwiched divergences, which for $\alpha \in (0, 1) \cup (1, \infty)$ are defined as
$$
\widetilde{D}_{\alpha}(\rho\|\sigma) = \frac{1}{\alpha - 1} \log \mathrm{Tr}\left[ (\sigma^{\frac{1-\alpha}{2\alpha}} \rho \sigma^{\frac{1-\alpha}{2\alpha}} )^\alpha \right].
$$
These divergences are monotonically increasing in $\alpha$ and satisfy the data processing inequality (DPI) for all $\alpha \geq 1/2$. Thus the smallest divergence in this family satisfying the DPI is
$$
\widetilde{D}_{\min}(\rho \| \sigma) = \widetilde{D}_{1/2}(\rho \|\sigma) = - \log \bigl\| \sqrt{\rho}\,\sqrt{\sigma} \bigr\|_1^2 = -\log F(\rho, \sigma),
$$
where $\|\cdot\|_1$ denotes the trace norm and $F(\rho,\sigma) = \bigl(\mathrm{Tr}\bigl[(\sqrt{\sigma}\,\rho\,\sqrt{\sigma})^{1/2}\bigr]\bigr)^2$ is the fidelity.
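If it helps to see this concretely, here is a minimal numerical sketch (my own illustration; the function names like `sandwiched_renyi` and `hpow` are hypothetical choices, and log base 2 is just one convention, as the identity holds in any base). It checks, for random qubit states, that $\widetilde{D}_{1/2}$ coincides with $-\log F$ and that $\widetilde{D}_{\alpha}$ is nondecreasing in $\alpha$. Matrix powers are taken via eigendecomposition so that PSD matrices are handled cleanly.

```python
import numpy as np

def hpow(a, p, tol=1e-12):
    # Power of a Hermitian PSD matrix via eigendecomposition;
    # eigenvalues below tol are treated as exactly zero.
    w, v = np.linalg.eigh(a)
    wp = np.zeros_like(w)
    wp[w > tol] = w[w > tol] ** p
    return (v * wp) @ v.conj().T

def sandwiched_renyi(rho, sigma, alpha):
    # Sandwiched alpha-Renyi divergence (log base 2).
    s = hpow(sigma, (1 - alpha) / (2 * alpha))
    return np.log2(np.trace(hpow(s @ rho @ s, alpha)).real) / (alpha - 1)

def fidelity(rho, sigma):
    # F(rho, sigma) = (Tr[(sqrt(sigma) rho sqrt(sigma))^(1/2)])^2.
    sq = hpow(sigma, 0.5)
    return np.trace(hpow(sq @ rho @ sq, 0.5)).real ** 2

def random_state(d, rng):
    # Random full-rank density matrix from a Ginibre matrix.
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

rng = np.random.default_rng(7)
rho, sigma = random_state(2, rng), random_state(2, rng)

# D_{1/2} should equal -log2 F(rho, sigma) up to numerical error.
print(sandwiched_renyi(rho, sigma, 0.5), -np.log2(fidelity(rho, sigma)))

# Monotonicity: these values should be nondecreasing in alpha.
print([round(sandwiched_renyi(rho, sigma, a), 6) for a in (0.5, 0.75, 1.5, 3.0)])
```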
Another well-studied family of divergences is the family of so-called Petz divergences, defined for $\alpha \in (0,1) \cup (1, \infty)$ to be
$$
\overline{D}_{\alpha}(\rho \| \sigma) = \frac{1}{\alpha - 1} \log \mathrm{Tr}[\rho^{\alpha} \sigma^{1-\alpha}].
$$
This family satisfies the DPI for $\alpha \in (0,1) \cup(1,2]$ and is also monotonically increasing in $\alpha$. Thus, the smallest divergence in this family satisfying the DPI is
$$
\overline{D}_{\min}(\rho \| \sigma) = \lim_{\alpha \to 0^+} \overline{D}_{\alpha}(\rho \|\sigma) = -\log \mathrm{Tr}[\Pi_\rho \sigma ],
$$
where $\Pi_\rho$ denotes the projector onto the support of $\rho$. The limit follows from $\rho^{\alpha} \to \Pi_\rho$ and $\sigma^{1-\alpha} \to \sigma$ as $\alpha \to 0^+$, together with $\frac{1}{\alpha - 1} \to -1$.
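To make this limit concrete, here is a small numerical sketch in the same spirit as above (again my own illustration with hypothetical helper names, not from any particular library). A rank-deficient $\rho$ is chosen deliberately so that $\Pi_\rho$ is a nontrivial projector; the Petz divergence at small $\alpha$ then visibly approaches $-\log \mathrm{Tr}[\Pi_\rho \sigma]$.

```python
import numpy as np

def hpow(a, p, tol=1e-12):
    # Power of a Hermitian PSD matrix, treating 0**p as 0
    # so that rank-deficient states are handled.
    w, v = np.linalg.eigh(a)
    wp = np.zeros_like(w)
    wp[w > tol] = w[w > tol] ** p
    return (v * wp) @ v.conj().T

def petz_renyi(rho, sigma, alpha):
    # Petz alpha-Renyi divergence (log base 2).
    val = np.trace(hpow(rho, alpha) @ hpow(sigma, 1 - alpha)).real
    return np.log2(val) / (alpha - 1)

def support_projector(rho, tol=1e-12):
    # Projector onto the support (range) of rho.
    w, v = np.linalg.eigh(rho)
    cols = v[:, w > tol]
    return cols @ cols.conj().T

# A rank-one rho makes the support projector nontrivial; sigma is full rank.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
sigma = np.array([[0.7, 0.2], [0.2, 0.3]])

d_min = -np.log2(np.trace(support_projector(rho) @ sigma).real)
for alpha in (0.5, 0.1, 0.01, 0.001):
    print(alpha, petz_renyi(rho, sigma, alpha))
print("limit:", d_min)  # the values above approach this as alpha -> 0+
```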