
[Feature Request] Allow accessing the filesystem in scripts and bundle/expose the fs modules. #306

Open
Rzpeg opened this issue Oct 3, 2023 · 52 comments

Comments

@Rzpeg

Rzpeg commented Oct 3, 2023

Please include the fs modules (fs, fs-extra) in the distribution and allow accessing the filesystem.
This is needed to generate attachments from files in pre-request scripts.

@mtHuberty

mtHuberty commented Oct 3, 2023

I'm interested in this too. I believe Anoop mentioned in this other thread that it's not yet allowed for security reasons, but that the goal would be to allow users to explicitly enable access.

My use case is that in my pre-request script, I want to read a token that is stored/updated in a specific file on my filesystem, use it to make HTTP calls to a Vault instance, and then retrieve secrets for use in fetching OAuth tokens that I can use in my requests.

If there are any updates on this, please let us know. If you're comfortable opening this up to a contribution, I'd be happy to investigate and make an attempt at a PR.

Edit: I tried to implement a workaround locally, but ran into issues using the node-vault package, which apparently depends internally on the node tty package (https://nodejs.org/api/tty.html). I'm tempted to ask if we can enable it the way Anoop suggested we could enable stream, path, url, and util in the issue I linked above, but I'm afraid we might end up in an infinite game of whack-a-mole where there are constantly new packages to be whitelisted. Instead, I might suggest that we allow users to specifically enable/disable certain built-in packages in their VM, or even offer an option for running the VM with a host context instead of a sandbox, perhaps with an appropriate warning about the security implications.

@pove

pove commented Oct 4, 2023

Would it be secure to access files inside collections? I mean, having a module that only has access to files stored in the same path as the .bru files.

@helloanoop
Contributor

There are two kinds of access that we need to consider when executing a script inside vm2:

  1. Network Access
  2. Filesystem Access

We decided to allow network access to scripts by default, since it is something that people who use Bruno commonly need.
In the future, we will provide a toggle to disable network access in bruno.json.

We decided to disable filesystem access by default.
The user can however enable it by adding the following to bruno.json

{
  "filesystemAccess": {
    "allow": true
  }
}

Would it be secure to access files inside collections? I mean, having a module that only has access to files stored in the same path as the .bru files.

Yes, we will have to build a wrapper around fs before passing to vm2

@helloanoop
Contributor

Just got this completed. Cutting a new release now.

@helloanoop
Contributor

Available in GUI v0.19.0
Not yet available in CLI

Go give it a whirl and let me know if your case is solved.


@Rzpeg
Author

Rzpeg commented Oct 5, 2023

@helloanoop
Amazing. My case is solved, as I can now do 2 things I have been missing:

  1. read an AuthToken from a file and use it.
  2. generate attachments, by reading binary files directly into a base64, setting the request body and the appropriate Content-Disposition headers.

Now need this supported in CLI :)

Thanks!

@helloanoop
Contributor

@Rzpeg Can you share your script? You can replace the URLs and other details for privacy. Your script would be super helpful for community members who want to do file uploads using Bruno.

@Rzpeg
Author

Rzpeg commented Oct 5, 2023

@helloanoop Sure, here it goes:

const fs = require('fs');
const path = require('path');

// Read the attachment from the collection directory as a base64 string
const attachmentFilename = "debug-data.bin";
const attachmentPath = path.join(bru.cwd(), attachmentFilename);
const attachment = fs.readFileSync(attachmentPath, "base64");
const attachmentLength = attachment.length;

// Send the file as the raw request body
req.setHeader("Content-Type", "application/octet-stream");
req.setHeader("Content-Disposition", `attachment; filename="${attachmentFilename}"`);
req.setHeader("Content-Length", attachmentLength);

req.setBody(attachment);

@Rzpeg
Author

Rzpeg commented Oct 6, 2023

@helloanoop
Does the CLI already support running scenarios that use the fs module?

Edit: Just did a test run on CLI 0.12.0, and not supported yet.

@eferro70

eferro70 commented Oct 6, 2023

Great job @Rzpeg. I have tried this, changing "application/octet-stream" to "multipart/form-data", but it does not work.

@Rzpeg
Author

Rzpeg commented Oct 6, 2023

Great job @Rzpeg. I have tried this, changing "application/octet-stream" to "multipart/form-data", but it does not work.

This is because the multipart/form-data content type has a different body structure/specification.

You need to define boundaries and construct the request in the proper manner:
https://www.w3.org/TR/html401/interact/forms.html#h-17.13.4.2
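As a rough illustration of what that spec requires, a hand-built multipart/form-data body looks something like this. This is a simplified sketch for text fields only; in practice a library like form-data handles file parts, encodings, and edge cases.

```javascript
// Build a minimal multipart/form-data body by hand.
// The boundary string is arbitrary; it just must not occur in the payload.
function buildMultipartBody(fields, boundary) {
  const parts = Object.entries(fields).map(([name, value]) =>
    `--${boundary}\r\n` +
    `Content-Disposition: form-data; name="${name}"\r\n\r\n` +
    `${value}\r\n`
  );
  return parts.join('') + `--${boundary}--\r\n`;
}

// Usage in a pre-request script might then look like (hypothetical):
//   const boundary = 'bruno-boundary-12345';
//   req.setHeader('Content-Type', `multipart/form-data; boundary=${boundary}`);
//   req.setBody(buildMultipartBody({ doc: attachment }, boundary));
```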

@helloanoop
Contributor

@eferro70 Once you figure out how to solve your use case, it would be great if you could share it in Scriptmania - #385

@jonasgheer

Hello, I got here via this issue. Please tell me if I should rather open a new ticket, but I found this thread quite relevant.

My problem:
I am attempting to access secrets in Azure Key Vault using @azure/keyvault-secrets and @azure/identity in a pre-script, but I am facing module issues:
Error invoking remote method 'send-http-request': VMError: Cannot find module 'os'. I managed to sneak past this one by installing this exported version, but then only to face the next one in line:
Error invoking remote method 'send-http-request': VMError: Cannot find module 'crypto'.

What is the current plan on resolving issues like this? Maybe a list of whitelisted packages from node could be included in bruno.json?

@helloanoop
Contributor

What is the current plan on resolving issues like this? Maybe a list of whitelisted packages from node could be included in bruno.json?

Yes, I agree with the whitelisting approach: moduleWhitelist

Also curious to know which tool were you using before Bruno :) And how were you managing secrets before.

@Rzpeg
Author

Rzpeg commented Oct 6, 2023

@helloanoop whitelisting is a neat idea.
Also, while you are here, do you happen to have an ETA for fs support in the CLI?

@helloanoop
Contributor

2 hrs :)

@Rzpeg
Author

Rzpeg commented Oct 6, 2023

That's... At least 2 days quicker than anticipated. :)
Take a weekend and... REST :P

@jonasgheer

What is the current plan on resolving issues like this? Maybe a list of whitelisted packages from node could be included in bruno.json?

Yes, I agree with the whitelisting approach: moduleWhitelist

Also curious to know which tool were you using before Bruno :) And how were you managing secrets before.

Awesome! In all honesty my current workflow is the good old copy/paste, which I'm trying to get away from. I am currently using Insomnia a bit on and off. There is an azure key vault plugin that I didn't get to work last time I tried (my fault no doubt). I stumbled across Bruno today and I got curious if I could get it working 😄

@helloanoop
Contributor

@jonasgheer @Rzpeg I have published Bru CLI v0.13.0 and Bru GUI v0.21.0

These have support for whitelisting and filesystem access. There is a breaking change in the config:

{
  "version": "1",
  "name": "bruno-testbench",
  "type": "collection",
  "scripts": {
    "moduleWhitelist": [
      "crypto"
    ],
    "filesystemAccess": {
      "allow": true
    }
  }
}


Let me know if you face any further issues. Also @jonasgheer Once you get Azure Key Vault working, please share your script for the community reference in Scriptmania - #385

Let's keep this ticket open. Need to update docs.

@jonasgheer

Oh wow, that was quick! Unfortunately, I'm still facing some issues. After whitelisting "os" and "crypto" I ran into this error:
Error: Error invoking remote method 'send-http-request': VMError: Operation not allowed on contextified object.. I had a go at grabbing the VM code in an attempt to narrow down the problem (on the way I even bumped into an issue you filed on the vm2 repo, so thanks for that one 😄). Unfortunately, the only thing I managed to conclude is that the script crashes somewhere in the require call: const { SecretClient } = require("@azure/keyvault-secrets");. From what I have gathered, I'm guessing the code in there tries to manipulate something it's not allowed to.

Granted, the module requires that you have the Azure CLI installed and that you are logged in. So I'm not even sure if it's really possible to have it run in a somewhat "safe" manner, or whether these sorts of modules more or less require full-blown host access 🤔 I'm flailing a bit in the dark here; my limited knowledge of both vm2 and @azure/keyvault-secrets has me a bit stumped.

@helloanoop

@jonasgheer


@helloanoop

I might be making some progress here, hold your horses 😄

@jonasgheer

I have identified some problems. I'll start with the biggest one, which sadly might make this whole thing a no-go.

@azure/keyvault-secrets pulls in 3 libraries amongst a bunch of others:

These packages all have a line of code in them calling util.inherits:

I found a related issue in the vm2 repo (it links further to other related issues). As far as I can gather, the conclusion is that the internal node modules are not "real" but are proxied (I'm on thin ice here), and as a result you are not allowed to extend other objects using them. This is, as far as I can tell, "Known issue" nr 2 in the readme:

It is not possible to define a class that extends a proxied class. This includes using a proxied class in Object.create.

I guess this might be something to pick up again after the project has moved to isolated-vm? Unless I have completely missed the mark here and there is an easier solution right in front of me that I can't think of 😅

@helloanoop
Contributor

I guess this might be something to pick up again after the project has moved to isolated-vm (#263)?

Yes, that makes sense.

Now that the CLI route didn't work, I recommend trying the API route.
I created a separate ticket, #454, to track doing it via the API.

@jonasgheer Can you take a stab at it?

@DivyMohan14
Contributor

Makes sense @helloanoop let me have a look at this ...

@helloanoop
Contributor

@DivyMohan14 I confirm that writing indeed works

const fs = require('fs');
const path = require('path');
const filepath = path.join(bru.cwd(), 'log.txt');

fs.writeFileSync(filepath, JSON.stringify(res.getBody()));

@DivyMohan14 When you have some time, please see if you are able to implement streaming event listeners on the res object.

We should then be able to use something like this:

let responseData = '';

res.on('data', (chunk) => {
    responseData += chunk;
});

res.on('end', () => {
    try {
        let jsonData = JSON.parse(responseData);

        // write to file
    } catch (error) {
        // handle malformed JSON
    }
});

@DivyMohan14
Contributor

DivyMohan14 commented Oct 11, 2023

I did the setup that @Xceron might have, and indeed the zip file write does not work as expected; as you said @helloanoop, the normal write works as expected.

The problem does not seem to be with the filesystem or write access; it looks to be some issue with the response configuration.
I am on it and expect to find the root cause in some time.

@DivyMohan14
Contributor

Looks like I found the issue: it is related to axios returning binary data. Dealing with that directly seems to be a problem, and an easier solution is to get the response as an arrayBuffer instead.

Wrote logic in the request preparation step to check if the Content-Type is application/zip, and if so set responseType to arraybuffer in the axios config.
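The described check can be sketched as a small helper. This is an illustration of the approach, not the exact PR code; `pickResponseType` is a made-up name.

```javascript
// Decide the axios responseType from the request's Content-Type header.
// Binary types like application/zip should be fetched as an ArrayBuffer so
// the bytes survive untouched; everything else keeps axios's default.
function pickResponseType(headers = {}) {
  const contentType = Object.entries(headers)
    .find(([name]) => name.toLowerCase() === 'content-type');
  if (contentType && contentType[1].toLowerCase().includes('application/zip')) {
    return 'arraybuffer';
  }
  return undefined; // let axios fall back to its default handling
}
```

The return value would then be passed as `responseType` in the axios request config.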

A good resource for the issue: link here

@helloanoop please have a look at the PR.

@Xceron you can try running the server from my branch in the meantime to check if this solves the issue?

@Xceron

Xceron commented Oct 11, 2023

@DivyMohan14 thank you for the quick PR!
I've tried running your version but still run into the same issue with my code mentioned here. What is your post response script to download the zip?

@DivyMohan14
Contributor

DivyMohan14 commented Oct 11, 2023

@Xceron
Interesting, I was running the same post-response script that you are using. Can you maybe send the server code which you are running? I had set up a mock server which streams a zip file.

Btw, in the request headers, did you mention application/zip as Content-Type? Just to cross-check.

My zip file stream mock setup

const express = require('express')
const router = express.Router()
const path = require('path')
const fs = require('fs')

// Streams a local zip file back to the client
router.get("/", async (req, res, next) => {
    const filepath = path.join("png.zip")
    const stat = fs.statSync(filepath)
    console.log("request received")
    res.set({
        "Content-Type": "application/zip",
        "Content-Length": stat.size,
        "File-Name": `${path.basename(filepath)}`
    })
    const readStream = fs.createReadStream(filepath)
    readStream.pipe(res)
})
module.exports = router

@Xceron

Xceron commented Oct 11, 2023

The server is a non-public Spring Boot (Java) Server, so I cannot share it, sadly.

The headers of the response are content-type: application/zip and transfer-encoding:chunked.

I will try and set up your mock server to test. I'm not a JS guy, so it'll take some time. Will report back if I manage to pinpoint the problem.

@DivyMohan14
Contributor

DivyMohan14 commented Oct 11, 2023

Cool, no issues. In the PR, if you noticed, I was using 'Content-Type' as the identifier.

So in Bruno, I believe you are setting Content-Type to application/zip in the header; if not, can you please check after setting this?
And can you confirm one more thing: in the response tab, is it still binary data that you are seeing?

@Xceron

Xceron commented Oct 11, 2023

I don't know whether I can follow you, so let me explain:

My request is a file upload (which I do using the pre-request script shared here), which takes multiple files as input and uploads them to a server. The server uses those files, does some things, and sends back a .zip.
I do not set the request headers manually (or at all), but I tried setting content-type: application/zip and content-type: multipart/form-data in the request, with no success in either case.

The files which the server receives are correct and work perfectly fine, so the request side from Bru works without any problem. The server then sends back the .zip and sets content-type: application/zip in the response. Bru shows the binary data in the response pane. The binary data is also mostly correct, as I can see the correct file names every time. The zip is just missing some bytes to be a valid zip.

@DivyMohan14
Contributor

You run the script you mentioned here after you get the response from the server, right? It said post-response script?

I will share my bru file here:

meta {
  name: Test zip file download issue
  type: http
  seq: 2
}

get {
  url: http://127.0.0.1:3002/
  body: none
  auth: none
}

headers {
  Content-Type: application/zip
}

script:post-response {
  const fs = require('fs');
  const path = require('path');
  const buffer = Buffer.from(res.getBody(), 'binary');
  const zipFilename = "output.zip";
  const zipPath = path.join(bru.cwd(), zipFilename);
  const contentLength = res.headers['content-length'];
  if (buffer.length !== parseInt(contentLength)) {
    throw new Error('Downloaded content length does not match the expected content length.');
  }
  fs.writeFileSync(zipPath, buffer);
}

Can you also share your bru file if possible? It will be easier to debug.

@Xceron

Xceron commented Oct 11, 2023

meta {
  name: Upload Files and zip them
  type: http
  seq: 11
}

post {
  url: http://127.0.0.1:8081/upload_files
  body: multipartForm
  auth: none
}

body:form-urlencoded {
  firstFile: undefined
  secondFile: undefined
}

script:pre-request {
  const fs = require('fs');
  const path = require('path');
  const FormData = require('form-data');
  
  const firstFile = "first.txt";
  const secondFile = "second.txt";
  
  const firstPath = path.join(bru.cwd(), firstFile);
  const secondPath = path.join(bru.cwd(), secondFile);
  const form = new FormData();
  form.append('firstFile', fs.createReadStream(firstPath), firstFile);
  form.append('secondFile', fs.createReadStream(secondPath), secondFile);
  req.setBody(form);
}

script:post-response {
  const fs = require('fs');
  const path = require('path');
  const buffer = Buffer.from(res.getBody(), 'binary');
  const zipFilename = "output.zip";
  const zipPath = path.join(bru.cwd(), zipFilename);
  const contentLength = res.headers['content-length'];
  if (buffer.length !== parseInt(contentLength)) {
    throw new Error('Downloaded content length does not match the expected content length.');
  }
  fs.writeFileSync(zipPath, buffer);
}

@DivyMohan14
Contributor

DivyMohan14 commented Oct 11, 2023

Yeah, can you please copy the headers section from my bru file into yours and then try once?

@Xceron

Xceron commented Oct 11, 2023

Just tried that, still running into the error.

@DivyMohan14
Contributor

DivyMohan14 commented Oct 11, 2023

In the response section, are you still seeing the response in binary format? If yes, then it is a problem; it should have changed to an array buffer if you are running from my branch.

@Xceron

Xceron commented Oct 11, 2023

No, with the headers set I just run into the error and get no response.

@DivyMohan14
Contributor

DivyMohan14 commented Oct 11, 2023

Got it, thanks, let me check. One thing: maybe the transfer-encoding: chunked might be a differentiator here. I did not look at that.

@DivyMohan14
Contributor

Can you share the response headers, @Xceron? I will check that.

@Xceron

Xceron commented Oct 11, 2023

Name               Value
content-type       application/zip
transfer-encoding  chunked
date               Wed, 11 Oct 2023 12:46:22 GMT
connection         close

@DivyMohan14
Contributor

DivyMohan14 commented Oct 11, 2023

@Xceron as you can see, if the transfer-encoding is chunked, then the server will not return the content-length,
so it will technically always fail.
Can you please remove that check and run the script again, if you have not already done it on this new branch?
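A version of the post-response length check that tolerates chunked responses (where no content-length header is present at all) might look like this. A sketch; `checkLength` is a made-up helper name.

```javascript
// Validate the downloaded buffer against content-length only when the
// header exists; responses with transfer-encoding: chunked omit it entirely.
function checkLength(buffer, headers) {
  const contentLength = headers['content-length'];
  if (contentLength === undefined) {
    return true; // nothing to compare against
  }
  return buffer.length === parseInt(contentLength, 10);
}
```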

@Xceron

Xceron commented Oct 11, 2023

If I just remove the check, the request goes through and I see binary output in the response, but the written zip is still corrupt. This happens both with and without the headers set.

@DivyMohan14
Contributor

DivyMohan14 commented Oct 11, 2023

Cool, thanks. Can you help me with some questions?

  1. Does the endpoint you hit to merge the text files and get the zip return the data directly as a zip file, or does it stream the data, given that transfer-encoding: chunked?
  2. If the header was set correctly, you should see a response like
    { "0":80,"1":75,"2":3,"3":4,"4":20 }, basically a buffer. I believe that is not happening for you and you are still seeing a binary response?
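Incidentally, the numbers in that buffer view are the ZIP local-file-header magic bytes ("PK\x03\x04", i.e. 80, 75, 3, 4 in decimal), which gives a quick sanity check for any downloaded archive:

```javascript
// A valid ZIP archive starts with the local file header signature
// 0x50 0x4B 0x03 0x04 ("PK\x03\x04") -- 80, 75, 3, 4 in decimal.
function looksLikeZip(buffer) {
  return buffer.length >= 4 &&
    buffer[0] === 0x50 && buffer[1] === 0x4b &&
    buffer[2] === 0x03 && buffer[3] === 0x04;
}
```

A post-response script could run this on the buffer before writing it to disk.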

@Xceron

Xceron commented Oct 11, 2023

The server uses Spring's InputStreamResource to send the read file in chunks, so it streams it.

I don't get any buffer. Could this be related to the file size? The .zip files in my tests are tiny (~3 KB) and the requests finish instantly.

@DivyMohan14
Contributor

Hmm, interesting. I also set up a server which streams zip files, but the same setup seems to be working fine for me.

@mupakoz

mupakoz commented Jan 15, 2024

@Xceron when trying to run your code I get an error: "Error invoking remote method 'send-http-request': VMError: Cannot find module 'form-data'". Any idea what is wrong?

@jaybhanushali3195

Hi everyone, I am facing the same error. Any idea how to resolve it?
@Xceron @helloanoop

I get the following error when trying to pass a zip file:

Error invoking remote method 'send-http-request': VMError: Operation not allowed on contextified object.

This is my script

const fs = require('fs');
const path = require('path');
const FormData = require("form-data");
console.log("hello")
const attachmentFilename = "current_request.bin";
const attachmentPath = path.join(bru.cwd(), attachmentFilename);
const attachment = fs.readFileSync(attachmentPath, "base64");
// const attachmentLength = attachment.length;

const formData = new FormData();
formData.append('doc', attachment, { filename: attachmentFilename }); 
req.setHeader("Content-Type", "application/octet-stream");
req.setHeader("Content-Encoding", "gzip");

req.setBody(formData);

bruno.json

{
    "version": "1",
    "name": "JayTrial",
    "type": "collection",
    "moduleWhitelist": ["form-data"],
    "scripts": {
        "filesystemAccess": {
            "allow": true
        }
    }
}

package.json

{
  "name": "JayTrial",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "form-data": "^4.0.0"
  }
}

Any idea how to solve ?

@CedricSodira

Hi @jaybhanushali3195,

Just faced the same issue. It seems you're missing the block

   "scripts": {
    "moduleWhitelist": ["form-data"], // white list form-data
    "filesystemAccess": {
      "allow": true
    }
  }

in your bruno.json.

See #1346.
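To spell out the fix: moduleWhitelist belongs inside the scripts key of bruno.json (matching the config helloanoop posted earlier in this thread), so the earlier bruno.json would become:

```json
{
    "version": "1",
    "name": "JayTrial",
    "type": "collection",
    "scripts": {
        "moduleWhitelist": ["form-data"],
        "filesystemAccess": {
            "allow": true
        }
    }
}
```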
