From 8b71512b08d1fe5e03f932d5b40a0a5a2b99df7a Mon Sep 17 00:00:00 2001 From: jakobrunge Date: Thu, 14 Apr 2022 17:53:52 +0200 Subject: [PATCH 1/2] updated readme and LPCMCI tutorial --- tutorials/tigramite_tutorial_latent-pcmci.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tutorials/tigramite_tutorial_latent-pcmci.ipynb b/tutorials/tigramite_tutorial_latent-pcmci.ipynb index ff5a8f22..9cbdd7ef 100644 --- a/tutorials/tigramite_tutorial_latent-pcmci.ipynb +++ b/tutorials/tigramite_tutorial_latent-pcmci.ipynb @@ -1898,7 +1898,7 @@ "We see that, unlike with the default setting, the algorithm fails to detect the edge $X^0_{t-1} {\\leftrightarrow} X^1_t$. This is a manifestation of what has been discussed in subsection 2.3: A low effect size due to autocorrelation of the time series. With the default settings, where `n_preliminary_iterations = 1`, this problem is addressed by restoring all removed edges after the preliminary phase and then working with larger default conditioning sets $\\mathcal{S}_{def}(X^i_{t-\\tau}, X^j_t)$ in the final ancestral phase (step 3.) of the algorithm.\n", "\n", "For those interested in validating this point in more detail:\\\n", - "The edge between $X^0_{t-1}$ and $X^1_t$ is removed because the (conditional) independence test wrongly judges $X^0_{t-1}$ and $X^1_t$ to be independent conditional on $X^0_{t-2}$. To see this set `verbosity = 1` when creating the LPCMCI object in the previous two cells and search for the line `(0,-1) independent (1, 0) given ((0, -2),) union set()` in the verbose output, or set `verbosity = 2` and search for `ANC(Y): (0, -1) _|_ (1, 0) | S_def = , S_pc = (0, -2): val = 0.10 / pval = 0.0287` (here, \"S_pc\" refers to the standard conditioning set $\\mathcal{S}$ and \"S_def\" to the default conditions $\\mathcal{S}_{def}$). Then set `verbosity = 2` in the default applications of LPCMCI in subsections 3.4 and 3.5 above and see that the same happens in the preliminary phases of these runs. 
However, in these cases the algorithm restores this edge before moving to the final ancestral phase while it remembers that $X^0_{t-2}$ is a causal ancestor of $X^0_{t-1}$ and that $X^1_{t-1}$ is a causal ancestor of $X^1_{t}$. The final ancestral phase therefore uses $\\mathcal{S}_{def}(X^0_{t-1}, X^0_t) = \\{X^0_{t-2}, X^1_{t-1}\\}$ and never tests whether $X^0_{t-1}$ and $X^1_t$ are conditionally independent given $X^0_{t-2}$. Indeed, search for `ANC(Y): (0, -1) _|_ (1, 0) | S_def = (0, -2) (1, -1), S_pc = : val = 0.19 / pval = 0.0000` in the verbose output.\n", + "The edge between $X^0_{t-1}$ and $X^1_t$ is removed because the (conditional) independence test wrongly judges $X^0_{t-1}$ and $X^1_t$ to be independent conditional on $X^0_{t-2}$. To see this, set `verbosity = 1` when creating the LPCMCI object in the previous two cells and search for the line `(0,-1) independent (1, 0) given ((0, -2),) union set()` in the verbose output, or set ``verbosity = 2`` and search for ``ANC(Y): (0, -1) _|_ (1, 0) | S_def = , S_pc = (0, -2): val = 0.10 / pval = 0.0287`` (here, ``S_pc`` refers to the standard conditioning set $\\mathcal{S}$ and ``S_def`` to the default conditions $\\mathcal{S}_{def}$). Then set `verbosity = 2` in the default applications of LPCMCI in subsections 3.4 and 3.5 above and see that the same happens in the preliminary phases of these runs. However, in these cases the algorithm restores this edge before moving to the final ancestral phase while it remembers that $X^0_{t-2}$ is a causal ancestor of $X^0_{t-1}$ and that $X^1_{t-1}$ is a causal ancestor of $X^1_{t}$. The final ancestral phase therefore uses $\\mathcal{S}_{def}(X^0_{t-1}, X^1_t) = \\{X^0_{t-2}, X^1_{t-1}\\}$ and never tests whether $X^0_{t-1}$ and $X^1_t$ are conditionally independent given $X^0_{t-2}$. 
Indeed, search for ``ANC(Y): (0, -1) _|_ (1, 0) | S_def = (0, -2) (1, -1), S_pc = : val = 0.19 / pval = 0.0000`` in the verbose output.\n", "\n", "For `tau_max = 2` it is further wrongly inferred that $X^2_{t-2}$ and $X^1_t$ are adjacent." ] From 8b2240895d424e5361f6f85d2bc550fe85866e1f Mon Sep 17 00:00:00 2001 From: jakobrunge Date: Thu, 14 Apr 2022 17:55:19 +0200 Subject: [PATCH 2/2] updated readme and LPCMCI tutorial --- tutorials/tigramite_tutorial_latent-pcmci.ipynb | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/tutorials/tigramite_tutorial_latent-pcmci.ipynb b/tutorials/tigramite_tutorial_latent-pcmci.ipynb index 9cbdd7ef..616182af 100644 --- a/tutorials/tigramite_tutorial_latent-pcmci.ipynb +++ b/tutorials/tigramite_tutorial_latent-pcmci.ipynb @@ -17,11 +17,11 @@ "\n", "This tutorial explains the **Latent-PCMCI (LPCMCI) algorithm**, which is implemented as the function `LPCMCI.run_lpcmci`. In contrast to the [PCMCI](https://github.com/jakobrunge/tigramite/blob/master/tutorials/tigramite_tutorial_basics.ipynb) and [PCMCIplus](https://github.com/jakobrunge/tigramite/blob/master/tutorials/tigramite_tutorial_pcmciplus.ipynb) algorithms, respectively implemented as `PCMCI.run_pcmci` and `PCMCI.run_pcmciplus`, LPCMCI allows for unobserved (aka latent) time series.\n", "\n", - "**Note:**\\\n", + "**Note:**\n", "This method is still experimental since the default settings of hyperparameters are still being fine-tuned. Feedback on this matter is kindly appreciated.\n", "\n", "---\n", - "**Publication on LPCMCI:**\\\n", + "**Publication on LPCMCI:**\n", "Gerhardus, Andreas and Runge, Jakob (2020). High-recall causal discovery for autocorrelated time series with latent confounders. In Larochelle, H., Ranzato, M., Hadsell, R., Balcan, M. F., and Lin, H., editors, *Advances in Neural Information Processing Systems*, volume 33, pages 12615–12625. Curran Associates, Inc. 
[https://proceedings.neurips.cc/paper/2020/file/94e70705efae423efda1088614128d0b-Paper.pdf](https://proceedings.neurips.cc/paper/2020/file/94e70705efae423efda1088614128d0b-Paper.pdf).\n", "\n", "---\n",