# fix: correctly track sources for open LSP documents (#5561)
# Description

## Problem

Resolves #5562

## Summary

I noticed that our LSP code puts unsaved files into an `input_files`
dictionary, but that dictionary was never consulted when adding files to a
`FileManager`, so file sources always reflected the on-disk contents, never
the unsaved editor buffers.
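
To make the ordering concrete, here is a minimal, self-contained sketch of the
overlay pattern the fix relies on. The `SourceOverlay` type and its methods are
illustrative inventions, not the actual `FileManager` API, and the sketch
assumes (as the PR does) that a path already present keeps the source it was
first given:

```rust
use std::collections::HashMap;
use std::fs;
use std::path::{Path, PathBuf};

/// Illustrative only: an overlay of unsaved editor buffers over on-disk sources.
/// The first source inserted for a path wins, so unsaved buffers must go in
/// before the disk scan; that ordering is what this PR enforces.
#[derive(Default)]
struct SourceOverlay {
    sources: HashMap<PathBuf, String>,
}

impl SourceOverlay {
    fn add_if_absent(&mut self, path: &Path, source: String) {
        // Keep whatever source was registered first for this path.
        self.sources.entry(path.to_path_buf()).or_insert(source);
    }

    fn build(unsaved: &HashMap<String, String>, disk_paths: &[PathBuf]) -> Self {
        let mut overlay = Self::default();
        // 1. Unsaved buffers first; the LSP state keys them by URI ("file:///...").
        for (uri, source) in unsaved {
            let path = uri.strip_prefix("file://").unwrap_or(uri.as_str());
            overlay.add_if_absent(Path::new(path), source.clone());
        }
        // 2. Then the on-disk files; paths already present keep the unsaved text.
        for path in disk_paths {
            if let Ok(source) = fs::read_to_string(path) {
                overlay.add_if_absent(path, source);
            }
        }
        overlay
    }
}
```

In the PR itself this ordering lives in the new
`insert_all_files_for_workspace_into_file_manager` wrapper in
`tooling/lsp/src/lib.rs` (see the diff below).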

Additionally, when we received a notification that a file had changed, only
that file (without any of the files it depends on) was processed, and only to
compute code lenses. References to types and variables were therefore not
updated, so features like hover, go to reference, and inlay hints wouldn't
work well.

To fix this, a few things have been done in this PR:
1. When creating a `FileManager`, always add these unsaved files first.
2. When we get the notification that a file was changed, type-check only the
current package, but without publishing diagnostics (Rust Analyzer, for one,
doesn't update the Problems list until you save a file, and this also matches
how it worked before this PR). See the condensed sketch after this list.
3. When a file is closed, also type-check the current package so we don't keep
references to locations in the now-discarded unsaved code.
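
For reference, here is the did-change handler condensed from the
`tooling/lsp/src/notifications/mod.rs` diff below, showing how items 2 and 3
are wired up (the did-close handler does the same, after dropping the cached
buffer). This is an excerpt of the PR's own code rather than a standalone
program:

```rust
// Condensed from this PR's did-change handler: cache the new text, then
// re-check only the package that contains the document, without publishing
// diagnostics (the Problems list is only refreshed on open/save).
pub(super) fn on_did_change_text_document(
    state: &mut LspState,
    params: DidChangeTextDocumentParams,
) -> ControlFlow<Result<(), async_lsp::Error>> {
    let text = params.content_changes.into_iter().next().unwrap().text;
    state.input_files.insert(params.text_document.uri.to_string(), text);

    let only_process_document_uri_package = true; // item 2: just this package
    let output_diagnostics = false; // don't touch the Problems list mid-edit

    match process_workspace_for_noir_document(
        state,
        params.text_document.uri,
        only_process_document_uri_package,
        output_diagnostics,
    ) {
        Ok(_) => ControlFlow::Continue(()),
        Err(err) => ControlFlow::Break(Err(err)),
    }
}
```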

Type-checking a package on every edit sounds like a lot, but I checked a few
big crates and type-checking a single package (in a workspace of many
packages) takes around 300 ms on my machine with a debug build, so I think
it's acceptable. Even if it's slower elsewhere, it's probably still good
enough: it's worse when things break and you get incorrect inlay hints or
error notifications. Ideally we'd process all top-level definitions in a
package but only fully type-check (looking inside function bodies) the file
that just changed; I couldn't find an easy way to do that, so that's left as
a later optimization if it's really needed.

## Additional Context

## Documentation\*

Check one:
- [x] No documentation needed.
- [ ] Documentation included in this PR.
- [ ] **[For Experimental Features]** Documentation to be submitted in a
separate PR.

# PR Checklist\*

- [x] I have tested the changes locally.
- [x] I have formatted the changes with [Prettier](https://prettier.io/)
and/or `cargo fmt` on default settings.
asterite authored Jul 23, 2024
1 parent da3d607 commit 9e61e97
Showing 7 changed files with 236 additions and 97 deletions.
18 changes: 17 additions & 1 deletion tooling/lsp/src/lib.rs
@@ -30,7 +30,7 @@ use lsp_types::{
use nargo::{
package::{Package, PackageType},
parse_all,
workspace::Workspace,
workspace::{self, Workspace},
};
use nargo_toml::{find_file_manifest, resolve_workspace_from_toml, PackageSelection};
use noirc_driver::{file_manager_with_stdlib, prepare_crate, NOIR_ARTIFACT_VERSION_STRING};
@@ -398,6 +398,22 @@ fn parse_diff(file_manager: &FileManager, state: &mut LspState) -> ParsedFiles {
}
}

pub fn insert_all_files_for_workspace_into_file_manager(
state: &LspState,
workspace: &workspace::Workspace,
file_manager: &mut FileManager,
) {
// First add files we cached: these have the source code of files that are modified
// but not saved to disk yet, and we want to make sure all LSP features work well
// according to these unsaved buffers, not what's saved on disk.
for (path, source) in &state.input_files {
let path = path.strip_prefix("file://").unwrap();
file_manager.add_file_with_source_canonical_path(Path::new(path), source.clone());
}

nargo::insert_all_files_for_workspace_into_file_manager(workspace, file_manager);
}

#[test]
fn prepare_package_from_source_string() {
let source = r#"
267 changes: 183 additions & 84 deletions tooling/lsp/src/notifications/mod.rs
@@ -1,20 +1,19 @@
use std::ops::ControlFlow;

use crate::insert_all_files_for_workspace_into_file_manager;
use async_lsp::{ErrorCode, LanguageClient, ResponseError};
use nargo::insert_all_files_for_workspace_into_file_manager;
use noirc_driver::{check_crate, file_manager_with_stdlib};
use noirc_errors::{DiagnosticKind, FileDiagnostic};

use crate::requests::collect_lenses_for_package;
use crate::types::{
notification, Diagnostic, DiagnosticSeverity, DidChangeConfigurationParams,
DidChangeTextDocumentParams, DidCloseTextDocumentParams, DidOpenTextDocumentParams,
DidSaveTextDocumentParams, InitializedParams, NargoPackageTests, PublishDiagnosticsParams,
};

use crate::{
byte_span_to_range, get_package_tests_in_crate, parse_diff, prepare_source,
resolve_workspace_for_source_path, workspace_package_for_file, LspState,
byte_span_to_range, get_package_tests_in_crate, parse_diff, resolve_workspace_for_source_path,
LspState,
};

pub(super) fn on_initialized(
@@ -38,8 +37,15 @@ pub(super) fn on_did_open_text_document(
state.input_files.insert(params.text_document.uri.to_string(), params.text_document.text);

let document_uri = params.text_document.uri;

match process_workspace_for_noir_document(document_uri, state) {
let only_process_document_uri_package = false;
let output_diagnostics = true;

match process_workspace_for_noir_document(
state,
document_uri,
only_process_document_uri_package,
output_diagnostics,
) {
Ok(_) => {
state.open_documents_count += 1;
ControlFlow::Continue(())
@@ -55,41 +61,19 @@ pub(super) fn on_did_change_text_document(
let text = params.content_changes.into_iter().next().unwrap().text;
state.input_files.insert(params.text_document.uri.to_string(), text.clone());

let file_path = params.text_document.uri.to_file_path().unwrap();

let (mut context, crate_id) = prepare_source(text, state);
let _ = check_crate(&mut context, crate_id, false, false, None);

let workspace = match resolve_workspace_for_source_path(
params.text_document.uri.to_file_path().unwrap().as_path(),
&state.root_path,
let document_uri = params.text_document.uri;
let only_process_document_uri_package = true;
let output_diagnotics = false;

match process_workspace_for_noir_document(
state,
document_uri,
only_process_document_uri_package,
output_diagnotics,
) {
Ok(workspace) => workspace,
Err(lsp_error) => {
return ControlFlow::Break(Err(ResponseError::new(
ErrorCode::REQUEST_FAILED,
lsp_error.to_string(),
)
.into()))
}
};

let package = match workspace_package_for_file(&workspace, &file_path) {
Some(package) => package,
None => {
return ControlFlow::Break(Err(ResponseError::new(
ErrorCode::REQUEST_FAILED,
"Selected workspace has no members",
)
.into()))
}
};

let lenses = collect_lenses_for_package(&context, crate_id, &workspace, package, None);

state.cached_lenses.insert(params.text_document.uri.to_string(), lenses);

ControlFlow::Continue(())
Ok(_) => ControlFlow::Continue(()),
Err(err) => ControlFlow::Break(Err(err)),
}
}

pub(super) fn on_did_close_text_document(
@@ -105,16 +89,35 @@ pub(super) fn on_did_close_text_document(
state.cached_definitions.clear();
}

ControlFlow::Continue(())
let document_uri = params.text_document.uri;
let only_process_document_uri_package = true;
let output_diagnotics = false;

match process_workspace_for_noir_document(
state,
document_uri,
only_process_document_uri_package,
output_diagnotics,
) {
Ok(_) => ControlFlow::Continue(()),
Err(err) => ControlFlow::Break(Err(err)),
}
}

pub(super) fn on_did_save_text_document(
state: &mut LspState,
params: DidSaveTextDocumentParams,
) -> ControlFlow<Result<(), async_lsp::Error>> {
let document_uri = params.text_document.uri;

match process_workspace_for_noir_document(document_uri, state) {
let only_process_document_uri_package = false;
let output_diagnotics = true;

match process_workspace_for_noir_document(
state,
document_uri,
only_process_document_uri_package,
output_diagnotics,
) {
Ok(_) => ControlFlow::Continue(()),
Err(err) => ControlFlow::Break(Err(err)),
}
@@ -124,8 +127,10 @@ pub(super) fn on_did_save_text_document(
// it's only contained in a package), then type-checks the workspace's packages,
// caching code lenses and type definitions, and notifying about compilation errors.
pub(crate) fn process_workspace_for_noir_document(
document_uri: lsp_types::Url,
state: &mut LspState,
document_uri: lsp_types::Url,
only_process_document_uri_package: bool,
output_diagnostics: bool,
) -> Result<(), async_lsp::Error> {
let file_path = document_uri.to_file_path().map_err(|_| {
ResponseError::new(ErrorCode::REQUEST_FAILED, "URI is not a valid file path")
@@ -137,13 +142,23 @@ pub(crate) fn process_workspace_for_noir_document(
})?;

let mut workspace_file_manager = file_manager_with_stdlib(&workspace.root_dir);
insert_all_files_for_workspace_into_file_manager(&workspace, &mut workspace_file_manager);
insert_all_files_for_workspace_into_file_manager(
state,
&workspace,
&mut workspace_file_manager,
);

let parsed_files = parse_diff(&workspace_file_manager, state);

let diagnostics: Vec<_> = workspace
.into_iter()
.flat_map(|package| -> Vec<Diagnostic> {
let package_root_dir: String = package.root_dir.as_os_str().to_string_lossy().into();

if only_process_document_uri_package && !file_path.starts_with(&package.root_dir) {
return vec![];
}

let (mut context, crate_id) =
crate::prepare_package(&workspace_file_manager, &parsed_files, package);

@@ -152,8 +167,6 @@ pub(crate) fn process_workspace_for_noir_document(
Err(errors_and_warnings) => errors_and_warnings,
};

let package_root_dir: String = package.root_dir.as_os_str().to_string_lossy().into();

// We don't add test headings for a package if it contains no `#[test]` functions
if let Some(tests) = get_package_tests_in_crate(&context, &crate_id, &package.name) {
let _ = state.client.notify::<notification::NargoUpdateTests>(NargoPackageTests {
@@ -176,46 +189,53 @@ pub(crate) fn process_workspace_for_noir_document(
let fm = &context.file_manager;
let files = fm.as_file_map();

file_diagnostics
.into_iter()
.filter_map(|FileDiagnostic { file_id, diagnostic, call_stack: _ }| {
// Ignore diagnostics for any file that wasn't the file we saved
// TODO: In the future, we could create "related" diagnostics for these files
if fm.path(file_id).expect("file must exist to have emitted diagnostic")
!= file_path
{
return None;
}

// TODO: Should this be the first item in secondaries? Should we bail when we find a range?
let range = diagnostic
.secondaries
.into_iter()
.filter_map(|sec| byte_span_to_range(files, file_id, sec.span.into()))
.last()
.unwrap_or_default();

let severity = match diagnostic.kind {
DiagnosticKind::Error => DiagnosticSeverity::ERROR,
DiagnosticKind::Warning => DiagnosticSeverity::WARNING,
DiagnosticKind::Info => DiagnosticSeverity::INFORMATION,
DiagnosticKind::Bug => DiagnosticSeverity::WARNING,
};
Some(Diagnostic {
range,
severity: Some(severity),
message: diagnostic.message,
..Default::default()
if output_diagnostics {
file_diagnostics
.into_iter()
.filter_map(|FileDiagnostic { file_id, diagnostic, call_stack: _ }| {
// Ignore diagnostics for any file that wasn't the file we saved
// TODO: In the future, we could create "related" diagnostics for these files
if fm.path(file_id).expect("file must exist to have emitted diagnostic")
!= file_path
{
return None;
}

// TODO: Should this be the first item in secondaries? Should we bail when we find a range?
let range = diagnostic
.secondaries
.into_iter()
.filter_map(|sec| byte_span_to_range(files, file_id, sec.span.into()))
.last()
.unwrap_or_default();

let severity = match diagnostic.kind {
DiagnosticKind::Error => DiagnosticSeverity::ERROR,
DiagnosticKind::Warning => DiagnosticSeverity::WARNING,
DiagnosticKind::Info => DiagnosticSeverity::INFORMATION,
DiagnosticKind::Bug => DiagnosticSeverity::WARNING,
};
Some(Diagnostic {
range,
severity: Some(severity),
message: diagnostic.message,
..Default::default()
})
})
})
.collect()
.collect()
} else {
vec![]
}
})
.collect();
let _ = state.client.publish_diagnostics(PublishDiagnosticsParams {
uri: document_uri,
version: None,
diagnostics,
});

if output_diagnostics {
let _ = state.client.publish_diagnostics(PublishDiagnosticsParams {
uri: document_uri,
version: None,
diagnostics,
});
}

Ok(())
}
@@ -226,3 +246,82 @@ pub(super) fn on_exit(
) -> ControlFlow<Result<(), async_lsp::Error>> {
ControlFlow::Continue(())
}

#[cfg(test)]
mod notification_tests {
use crate::test_utils;

use super::*;
use lsp_types::{
InlayHintLabel, InlayHintParams, Position, TextDocumentContentChangeEvent,
TextDocumentIdentifier, TextDocumentItem, VersionedTextDocumentIdentifier,
WorkDoneProgressParams,
};
use tokio::test;

#[test]
async fn test_caches_open_files() {
let (mut state, noir_text_document) = test_utils::init_lsp_server("inlay_hints").await;

// Open the document, fake the text to be empty
on_did_open_text_document(
&mut state,
DidOpenTextDocumentParams {
text_document: TextDocumentItem {
uri: noir_text_document.clone(),
language_id: "noir".to_string(),
version: 0,
text: "".to_string(),
},
},
);

// Fake the text to change to "global a = 1;"
on_did_change_text_document(
&mut state,
DidChangeTextDocumentParams {
text_document: VersionedTextDocumentIdentifier {
uri: noir_text_document.clone(),
version: 1,
},
content_changes: vec![TextDocumentContentChangeEvent {
range: None,
range_length: None,
// Should get an inlay hint for ": bool" after "a"
text: "global a = true;".to_string(),
}],
},
);

// Get inlay hints. These should now be relative to the changed text,
// not the saved file's text.
let inlay_hints = crate::requests::on_inlay_hint_request(
&mut state,
InlayHintParams {
work_done_progress_params: WorkDoneProgressParams { work_done_token: None },
text_document: TextDocumentIdentifier { uri: noir_text_document },
range: lsp_types::Range {
start: lsp_types::Position { line: 0, character: 0 },
end: lsp_types::Position { line: 1, character: 0 },
},
},
)
.await
.expect("Could not execute on_inlay_hint_request")
.unwrap();

assert_eq!(inlay_hints.len(), 1);

let inlay_hint = &inlay_hints[0];
assert_eq!(inlay_hint.position, Position { line: 0, character: 8 });

if let InlayHintLabel::LabelParts(labels) = &inlay_hint.label {
assert_eq!(labels.len(), 2);
assert_eq!(labels[0].value, ": ");
assert_eq!(labels[0].location, None);
assert_eq!(labels[1].value, "bool");
} else {
panic!("Expected InlayHintLabel::LabelParts, got {:?}", inlay_hint.label);
}
}
}