
Developer #211

Merged
merged 2 commits on Apr 14, 2022
6 changes: 3 additions & 3 deletions tutorials/tigramite_tutorial_latent-pcmci.ipynb
@@ -17,11 +17,11 @@
"\n",
"This tutorial explains the **Latent-PCMCI (LPCMCI) algorithm**, which is implemented as the function `LPCMCI.run_lpcmci`. In contrast to the [PCMCI](https://github.com/jakobrunge/tigramite/blob/master/tutorials/tigramite_tutorial_basics.ipynb) and [PCMCIplus](https://github.com/jakobrunge/tigramite/blob/master/tutorials/tigramite_tutorial_pcmciplus.ipynb) algorithms, respectively implemented as `PCMCI.run_pcmci` and `PCMCI.run_pcmciplus`, LPCMCI allows for unobserved (aka latent) time series.\n",
"\n",
- "**Note:**\\\n",
+ "**Note:**\n",
"This method is still experimental since the default settings of hyperparameters are still being fine-tuned. Feedback on this matter is kindly appreciated.\n",
"\n",
"---\n",
- "**Publication on LPCMCI:**\\\n",
+ "**Publication on LPCMCI:**\n",
"Gerhardus, Andreas and Runge, Jakob (2020). High-recall causal discovery for autocorrelated time series with latent confounders. In Larochelle, H., Ranzato, M., Hadsell, R., Balcan, M. F., and Lin, H., editors, *Advances in Neural Information Processing Systems*, volume 33, pages 12615–12625. Curran Associates, Inc. [https://proceedings.neurips.cc/paper/2020/file/94e70705efae423efda1088614128d0b-Paper.pdf](https://proceedings.neurips.cc/paper/2020/file/94e70705efae423efda1088614128d0b-Paper.pdf).\n",
"\n",
"---\n",
@@ -1898,7 +1898,7 @@
"We see that, unlike with the default setting, the algorithm fails to detect the edge $X^0_{t-1} {\\leftrightarrow} X^1_t$. This is a manifestation of what has been discussed in subsection 2.3: a low effect size due to autocorrelation of the time series. With the default settings, where `n_preliminary_iterations = 1`, this problem is addressed by restoring all removed edges after the preliminary phase and then working with larger default conditioning sets $\\mathcal{S}_{def}(X^i_{t-\\tau}, X^j_t)$ in the final ancestral phase (step 3.) of the algorithm.\n",
"\n",
"For those interested in validating this point in more detail:\\\n",
- "The edge between $X^0_{t-1}$ and $X^1_t$ is removed because the (conditional) independence test wrongly judges $X^0_{t-1}$ and $X^1_t$ to be independent conditional on $X^0_{t-2}$. To see this, set `verbosity = 1` when creating the LPCMCI object in the previous two cells and search for the line `(0,-1) independent (1, 0) given ((0, -2),) union set()` in the verbose output, or set `verbosity = 2` and search for `ANC(Y): (0, -1) _|_ (1, 0) | S_def = , S_pc = (0, -2): val = 0.10 / pval = 0.0287` (here, \"S_pc\" refers to the standard conditioning set $\\mathcal{S}$ and \"S_def\" to the default conditions $\\mathcal{S}_{def}$). Then set `verbosity = 2` in the default applications of LPCMCI in subsections 3.4 and 3.5 above and see that the same happens in the preliminary phases of these runs. However, in these cases the algorithm restores this edge before moving to the final ancestral phase while it remembers that $X^0_{t-2}$ is a causal ancestor of $X^0_{t-1}$ and that $X^1_{t-1}$ is a causal ancestor of $X^1_{t}$. The final ancestral phase therefore uses $\\mathcal{S}_{def}(X^0_{t-1}, X^0_t) = \\{X^0_{t-2}, X^1_{t-1}\\}$ and never tests whether $X^0_{t-1}$ and $X^1_t$ are conditionally independent given $X^0_{t-2}$. Indeed, search for `ANC(Y): (0, -1) _|_ (1, 0) | S_def = (0, -2) (1, -1), S_pc = : val = 0.19 / pval = 0.0000` in the verbose output.\n",
+ "The edge between $X^0_{t-1}$ and $X^1_t$ is removed because the (conditional) independence test wrongly judges $X^0_{t-1}$ and $X^1_t$ to be independent conditional on $X^0_{t-2}$. To see this, set `verbosity = 1` when creating the LPCMCI object in the previous two cells and search for the line `(0,-1) independent (1, 0) given ((0, -2),) union set()` in the verbose output, or set ``verbosity = 2`` and search for ``ANC(Y): (0, -1) _|_ (1, 0) | S_def = , S_pc = (0, -2): val = 0.10 / pval = 0.0287`` (here, ``S_pc`` refers to the standard conditioning set $\\mathcal{S}$ and ``S_def`` to the default conditions $\\mathcal{S}_{def}$). Then set `verbosity = 2` in the default applications of LPCMCI in subsections 3.4 and 3.5 above and see that the same happens in the preliminary phases of these runs. However, in these cases the algorithm restores this edge before moving to the final ancestral phase while it remembers that $X^0_{t-2}$ is a causal ancestor of $X^0_{t-1}$ and that $X^1_{t-1}$ is a causal ancestor of $X^1_{t}$. The final ancestral phase therefore uses $\\mathcal{S}_{def}(X^0_{t-1}, X^0_t) = \\{X^0_{t-2}, X^1_{t-1}\\}$ and never tests whether $X^0_{t-1}$ and $X^1_t$ are conditionally independent given $X^0_{t-2}$. Indeed, search for ``ANC(Y): (0, -1) _|_ (1, 0) | S_def = (0, -2) (1, -1), S_pc = : val = 0.19 / pval = 0.0000`` in the verbose output.\n",
"\n",
"For `tau_max = 2` it is further wrongly inferred that $X^2_{t-2}$ and $X^1_t$ are adjacent."
]
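The notebook text in the second hunk turns on the behavior of a partial-correlation conditional independence test (tigramite's `ParCorr`) under strong autocorrelation. As a minimal sketch of what such a test computes — illustrative only, not tigramite's actual implementation; the variable names, coefficients, and sample size here are made up:

```python
import numpy as np

# Sketch of a partial-correlation CI test, the idea behind ParCorr:
# regress both variables on the conditioning set and correlate the
# residuals. All names and coefficients below are illustrative.

def parcorr(x, y, z):
    """Partial correlation of x and y given conditioning matrix z."""
    z = np.column_stack([z, np.ones(len(x))])  # add intercept column
    # Residuals of x and y after linear regression on z
    rx = x - z @ np.linalg.lstsq(z, x, rcond=None)[0]
    ry = y - z @ np.linalg.lstsq(z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(0)
T = 500
x0 = np.zeros(T)
x1 = np.zeros(T)
for t in range(1, T):
    x0[t] = 0.9 * x0[t - 1] + rng.normal()   # autocorrelated driver
    x1[t] = 0.6 * x0[t - 1] + rng.normal()   # lagged effect of x0

# Test X0_{t-1} vs X1_t given X0_{t-2}: strong autocorrelation of x0
# shrinks the effect size of this link, which is the mechanism by which
# LPCMCI can wrongly remove the edge in its preliminary phase.
val = parcorr(x0[1:-1], x1[2:], x0[:-2].reshape(-1, 1))
print(round(val, 2))
```

Whether the resulting test statistic falls below the significance threshold depends on the sample size, the autocorrelation strength, and `pc_alpha`, which is why the tutorial's outcome changes with `n_preliminary_iterations`.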