Remove event-stream dependency in favor of plain Buffers #1554
Conversation
src/plugins/irc-events/link.js (Outdated)

```js
		.pipe(es.wait(function(err, data) {
			if (err) {
				return cb(null);
			}
```
`end`'s callback has no `err` argument, so I just removed this. Maybe we need to do this in `.on("error", function() {})` above instead?
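For illustration, a minimal standalone sketch of that idea, using plain Node.js `http` rather than the `request` wrapper used in this PR (the URL and helper name are made up): failures are reported through the `error` event, since `end` provides no error argument.

```js
// Hypothetical standalone sketch (plain Node.js "http", not the request library
// used in this PR): the "end" event passes no error argument, so failures have
// to be handled on the "error" event instead.
const http = require("http");

function fetchBuffer(uri, cb) {
	http.get(uri, (res) => {
		const buffers = [];

		res.on("data", (chunk) => buffers.push(chunk));

		// "end" only fires when the response completed; there is no err here.
		res.on("end", () => cb(Buffer.concat(buffers)));
	}).on("error", () => {
		// Network-level failures surface here, so bail out with null.
		cb(null);
	});
}

// Example usage (hypothetical URL):
fetchBuffer("http://example.com/", (result) => {
	console.log(result ? result.length : "failed");
});
```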
force-pushed from 703a7dc to 754debb
@astorije I am looking at
Sorry, I'm not sure what you mean. Are you talking about never caring about buffers ourselves?
Something like this?

```diff
──────────────────────────────────────────
modified: src/plugins/irc-events/link.js
──────────────────────────────────────────
@ link.js:145 @ function emitPreview(client, msg, preview) {
 function fetch(uri, cb) {
 	let req;
-	let result = new Buffer(0);
 	try {
 		req = request.get({
 			url: uri,
@ link.js:172 @ function fetch(uri, cb) {
 			length += data.length;
 			if (length > limit) {
 				req.response.req.abort();
-			} else {
-				result = Buffer.concat([result, data]);
 			}
 		})
 		.on("end", () => {
@ link.js:191 @ function fetch(uri, cb) {
 			}
 			cb({
-				data: result,
+				data: req.response.body,
 				type: type,
 				size: size
 			});
```
@astorije Quite possibly like that, but I think
So, I can't seem to access the
I just tried it myself, and I can't access it either. However, I changed the buffering code a little to improve the perf, and it also no longer drops the last received chunk (it would abort on that chunk, but not buffer it):

```diff
diff --git a/src/plugins/irc-events/link.js b/src/plugins/irc-events/link.js
index 06049b1..ef73dc5 100644
--- a/src/plugins/irc-events/link.js
+++ b/src/plugins/irc-events/link.js
@@ -5,7 +5,6 @@ const request = require("request");
 const url = require("url");
 const Helper = require("../../helper");
 const findLinks = require("../../../client/js/libs/handlebars/ircmessageparser/findLinks");
-const es = require("event-stream");
 const storage = require("../storage");
 
 process.setMaxListeners(0);
@@ -143,6 +142,7 @@ function emitPreview(client, msg, preview) {
 function fetch(uri, cb) {
 	let req;
+
 	try {
 		req = request.get({
 			url: uri,
@@ -155,8 +155,11 @@ function fetch(uri, cb) {
 	} catch (e) {
 		return cb(null);
 	}
+
+	const buffers = [];
 	var length = 0;
 	var limit = Helper.config.prefetchMaxImageSize * 1024;
+
 	req
 		.on("response", function(res) {
 			if (!(/^image\/.+/.test(res.headers["content-type"]))) {
@@ -166,18 +169,15 @@ function fetch(uri, cb) {
 			}
 		})
 		.on("error", function() {})
-		.pipe(es.map(function(data, next) {
+		.on("data", (data) => {
 			length += data.length;
+			buffers.push(data);
+
 			if (length > limit) {
-				req.response.req.abort();
-			}
-			next(null, data);
-		}))
-		.pipe(es.wait(function(err, data) {
-			if (err) {
-				return cb(null);
+				req.abort();
 			}
-
+		})
+		.on("end", () => {
 			if (req.response.statusCode < 200 || req.response.statusCode > 299) {
 				return cb(null);
 			}
@@ -193,14 +193,12 @@ function fetch(uri, cb) {
 				type = req.response.headers["content-type"].split(/ *; */).shift();
 			}
 
-			data = {
-				data: data,
+			cb({
+				data: Buffer.concat(buffers, length),
 				type: type,
 				size: size
-			};
-
-			cb(data);
-		}));
+			});
+		});
 }
 
 // https://github.com/request/request/issues/2120
```
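As a side note, a small self-contained sketch (not part of the PR, helper name made up) of the buffering pattern used in the diff above: chunks are pushed into an array and concatenated once with the known total length, instead of reallocating a growing Buffer on every chunk.

```js
// Standalone sketch of the "collect chunks, concat once" pattern. Concatenating
// on every chunk copies all previously received bytes each time; pushing chunks
// and concatenating once at the end copies them only once.
const {Readable} = require("stream");

function collect(stream, cb) {
	const buffers = [];
	let length = 0;

	stream
		.on("data", (chunk) => {
			buffers.push(chunk);
			length += chunk.length;
		})
		.on("end", () => {
			// Passing the total length lets Buffer.concat skip recomputing it.
			cb(Buffer.concat(buffers, length));
		});
}

// Usage example with an in-memory stream:
collect(Readable.from([Buffer.from("hello "), Buffer.from("world")]), (result) => {
	console.log(result.toString()); // "hello world"
});
```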
Good call, will do that.
This feels like quite a fundamental thing to put into 2.5.0 so close to release, no? We are so close to it.
@YaManicKill It's a simple change, really. I have tested it already with the diff I posted above.
I'm perfectly fine with not having it in 2.5.0 (what I intended originally), for what it's worth. Or if you feel like it should, I'm cool with it too. There will be a RC phase anyway and this change is really a "works fine or completely breaks" kind of thing, but really I'm fine with whatever :)
force-pushed from 754debb to 2233e5b
Alright, applied your changes @xPaw and it works well. Also, using the following:

```diff
───────────────────────────────────────────
modified: src/plugins/irc-events/link.js
───────────────────────────────────────────
@ link.js:171 @ function fetch(uri, cb) {
 		.on("error", function() {})
 		.on("data", (data) => {
 			if (length + data.length > limit) {
+				console.log("aborting, image too big!");
 				req.abort();
 			} else {
+				console.log("downloading image, current length:", length);
 				length += data.length;
 				buffers.push(data);
 			}
 		})
 		.on("end", () => {
+			console.log("download over, response code:", req.response.statusCode);
 			if (req.response.statusCode < 200 || req.response.statusCode > 299) {
 				return cb(null);
 			}
```

I could verify that both cases behave as expected: when successfully downloading an image, and when an image is too big.
src/plugins/irc-events/link.js (Outdated)

```js
				req.abort();
			} else {
				length += data.length;
				buffers.push(data);
```
You did this differently than in my diff, you're just throwing away the last received buffer, remove the `else`.
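To illustrate the point (hypothetical helper, not code from this PR): buffering the chunk before the limit check keeps the chunk that crosses the threshold, instead of downloading it and then discarding it.

```js
// Hypothetical helper illustrating the ordering discussed above: push first,
// then check the limit, so the chunk that crosses the threshold is still kept.
function makeDataHandler(buffers, limit, abort) {
	let length = 0;

	return (data) => {
		// Buffer the chunk first, then decide whether to stop downloading.
		length += data.length;
		buffers.push(data);

		if (length > limit) {
			abort();
		}
	};
}

// Example usage with made-up values:
const buffers = [];
const onData = makeDataHandler(buffers, 10, () => console.log("aborting"));
onData(Buffer.from("hello world")); // 11 bytes > 10: logs "aborting", chunk is kept
```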
> you're just throwing away the last received buffer

What do you mean? It doesn't matter whether we push the last chunk or not, since the request gets aborted and the picture is not rendered, does it?
This code is used to fetch HTML too. You're downloading data and throwing it away for no good reason, which is a regression compared to the previous code.
On that note, we should rename `prefetchMaxImageSize` into `prefetchMaxSize` or something, if it's used for non-images too.
It sets the limit to 50 kB once it detects a non-image.
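A rough sketch of that behaviour as described in this thread (the handler shape and the 50 kB figure are taken from the discussion, not copied from the merged code):

```js
// Sketch based on the discussion: start with the configured image limit and
// drop to 50 kB as soon as the response turns out not to be an image.
const configuredImageLimit = 2048 * 1024; // hypothetical prefetchMaxImageSize * 1024
let limit = configuredImageLimit;

function onResponse(res) {
	if (!/^image\/.+/.test(res.headers["content-type"])) {
		// Non-image content (e.g. HTML for link previews) only needs a little data.
		limit = 50 * 1024;
	}
}

// Example: an HTML response lowers the limit to 51200 bytes.
onResponse({headers: {"content-type": "text/html; charset=utf-8"}});
console.log(limit); // 51200
```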
Done.
force-pushed from 2233e5b to 75b927b
Right, I forgot about that. I shouldn't commit at 4am :)
@YaManicKill, I'll let you review and decide if you want this in 2.5.0 or later. I won't be offended in any way, do as you prefer :)
Alright, yeah I guess this makes sense. Let's go for it.
We were not making real use of `event-stream` apart from a minor convenience, so we might as well not rely on it (and its 7 dependencies).

I am not totally familiar with `Buffer`s so I might not be 100% correct, but it does work just as expected and tests pass. 🤷♂️
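For readers skimming the thread, a condensed, self-contained sketch of the resulting shape of the fetch logic (simplified from the diffs above; plain Node.js `http` and a made-up limit stand in for the project's `request` wrapper and config helpers):

```js
// Condensed sketch of the post-change approach: accumulate chunks with plain
// Buffers instead of piping through event-stream's es.map/es.wait helpers.
const http = require("http");

function fetch(uri, cb) {
	const buffers = [];
	let length = 0;
	const limit = 2048 * 1024; // hypothetical size cap in bytes

	const req = http.get(uri, (res) => {
		res.on("data", (data) => {
			length += data.length;
			buffers.push(data);

			// Stop downloading once the cap is exceeded; no preview is produced.
			if (length > limit) {
				req.abort();
			}
		});

		res.on("end", () => {
			if (res.statusCode < 200 || res.statusCode > 299) {
				return cb(null);
			}

			// One final concat instead of growing a Buffer on every chunk.
			cb({
				data: Buffer.concat(buffers, length),
				type: res.headers["content-type"],
				size: length,
			});
		});
	});

	req.on("error", () => cb(null));
}

// Example usage (hypothetical URL):
fetch("http://example.com/", (result) => {
	console.log(result ? `${result.type}, ${result.size} bytes` : "no preview");
});
```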