[Dependency Management Phase 2]: Github Actions/Workflows RFC #206
Conversation
### Updating Dependencies Manually
All of these steps will run as a workflow with the option for a manual
dispatch, but can also easily be run manually on a local system. Additionally, we
could eventually create a script inside of the `<buildpack>/scripts` directory
to perform the main steps of the `buildpack.toml` update process locally. This
will replace the need for the `jam update-dependencies` command that the
automation currently uses. Since dependency-related logic will live alongside
the buildpack, it makes more sense for manual dependency-update scripts to live
there as well in order to leverage that code.
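The version-diff core of such a local script could be quite small. The sketch below is a hypothetical illustration, not the RFC's actual script: the version lists are hard-coded stand-ins for what a real script would fetch from upstream and read out of `buildpack.toml`.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the version-diff step a
# <buildpack>/scripts/update-dependencies.sh might perform.
set -euo pipefail

# In a real script these would come from the upstream source and from
# buildpack.toml; hard-coded here so the sketch is self-contained.
upstream_versions=$'3.1.0\n3.1.1\n3.2.0'
known_versions=$'3.1.0\n3.1.1'

# Versions present upstream but missing from buildpack.toml are the
# candidates for metadata generation (and compilation, if required).
# comm -13 prints only the lines unique to the second input.
new_versions=$(comm -13 <(echo "${known_versions}" | sort) \
                        <(echo "${upstream_versions}" | sort))

echo "new versions: ${new_versions}"
```

From here the script would hand each new version to the metadata-generation and (if needed) compilation steps described elsewhere in the RFC.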
I am a little curious why we would want to write a new script instead of just modifying the functionality of `jam update-dependencies` to do whatever that script would do, or even potentially adding some new command if you want to keep the previous generation's functionality. It would make sense to me to try and keep `jam` as a Paketo buildpack utility tool, and if this manual run could be addressed by a generic script, it seems to follow that we should just make our tool do what that script would do.
@ForestEckhardt The `jam` command is currently a kind of bridge between a buildpack and the dep-server, which won't be needed in the new plan. A lot of the core functionality of the `jam` command and dep-server will be factored out into code that lives in the buildpack dependency code (in Phase 1), so the job of the command will be very simple, leveraging logic from the buildpack itself. Whatever script or command we end up having will simply be gluing together pieces of code from the Phase 1 buildpack dependency code, rather than containing any actual dependency-specific logic, if that makes sense.
Overall, the new script will be very simple. `jam update-dependencies` currently reaches the dep-server, diffs versions against the `buildpack.toml`, and then updates the `buildpack.toml` in one fell swoop, which in the new world will be separated out. Sure, this could go in `jam` if we want it to, but it'll be very simple and to me makes more sense as a simple bash script that runs everything.
I'll point you to an example of what I mean when I have one.
I guess I just think it would be better to keep as many buildpack development tools in `jam` as possible, to hopefully help spread this way of doing dependencies, but I am not tied to the idea.
A lot of the functionality of `jam update-dependencies` is spread out into smaller, composable pieces so that they can be plugged in with other functionality in order to support a bunch of different use cases.
I'd say that the "Assemble" step (see step 9 in the overview) might be the best candidate for pushing into `jam update-dependencies` instead of a GitHub action. It would be a command that just knows how to update the `buildpack.toml` given metadata. I'm not particularly married to it being an action or a `jam` command, really.
I’d say it's better to be an action if that step should run in a container, and better to put it in `jam` if it'll “just work” on any CLI user's local workstation, given the appropriate inputs.
Alright, seeing as two tooling maintainers are both on board with the metadata update step being a part of `jam`, I have updated the RFC to reflect that. It does not need to run inside of a container specifically, so it probably doesn't need to be an action.
5. Test the dependency (whether compiled or not) using the test from
   `<buildpack>/dependency/test`
If we are using dependencies directly from upstream, do we need to test them? In most cases, where we aren't compiling, it seems like the workflow should be as simple as generating metadata and PRing `buildpack.toml`. If we need to test anything, we could run those tests as a gate on the PR.
It may not be necessary to test dependencies coming directly from upstream. The POC as written (https://github.com/joshuatcasey/bundler/blob/main/.github/workflows/dependencies.yml) does combine the compile and test steps, so that `test` only runs after `compile` to test the compiled artifact. That being said, the buildpack author could choose a test strategy that is simply `exit 0` so it always passes.
One use case for testing upstream dependencies is to enforce a particular structure on the upstream dependency in case it's been known to change. Of course, if an upstream dependency changed its structure without warning, the Bump Versions PR would likely fail integration tests, so unless we can think of some exceptional case, it's likely we only need to test the compiled artifact.
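To make the structure-enforcement idea concrete, a maintainer's `<buildpack>/dependency/test` could range from a bare `exit 0` to a layout check on the artifact. The sketch below is hypothetical (the archive paths and expected layout are invented for illustration); it builds a stand-in tarball so the example is self-contained, where a real script would receive the downloaded dependency instead.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a <buildpack>/dependency/test script that
# enforces an expected structure on an upstream artifact.
set -euo pipefail

# Build a stand-in artifact so the sketch is self-contained; a real
# script would be handed the downloaded dependency tarball instead.
workdir=$(mktemp -d)
mkdir -p "${workdir}/dep/bin"
touch "${workdir}/dep/bin/ruby"
tar -C "${workdir}" -czf "${workdir}/dep.tgz" dep

# The actual check: fail if the expected entry is missing from the
# archive listing, so a silent upstream layout change breaks early.
if tar -tzf "${workdir}/dep.tgz" | grep -q '^dep/bin/ruby$'; then
  echo "structure ok"
else
  echo "unexpected archive layout" >&2
  exit 1
fi
```

The trivial strategy mentioned above is just a script whose body is `exit 0`.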
Let's leave this up to maintainers to decide in non-compiled cases. I've updated the RFC to reflect that.
5. In the same matrix loop, dependency compilation will be kicked off with the
   code from the buildpacks on the basis of whether the metadata contains the
   `SHA256` and `URI` of the dependency.
6. A smoke test will be run against the dependency in both the
   compiled/non-compiled cases via a step that simply runs the test from
   `<buildpack>/dependency/test`.
7. (If compiled) The dependency will be uploaded via an action.
8. (If compiled) The metadata is updated with the SHA256 and GCP bucket URI in
   a workflow step.
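Steps 5-8 above could be sketched as conditional steps in a single matrix job. This is an illustration only, not the RFC's actual workflow: the job names, the `discover` job outputs, and the upload action are all invented for the example.

```yaml
# Sketch only: job, step, and action names below are hypothetical.
# Assume a prior "discover" job emitted the new versions and whether
# each dependency needs compilation (i.e. metadata lacks SHA256/URI).
update:
  runs-on: ubuntu-latest
  needs: discover
  strategy:
    matrix:
      version: ${{ fromJSON(needs.discover.outputs.new-versions) }}
  steps:
    # Step 5: compile only when the metadata lacks SHA256/URI
    - name: Compile dependency
      if: needs.discover.outputs.needs-compile == 'true'
      run: ./dependency/compile --version "${{ matrix.version }}"
    # Step 6: smoke test in both compiled and non-compiled cases
    - name: Test dependency
      run: ./dependency/test
    # Step 7: upload the compiled artifact (hypothetical action)
    - name: Upload dependency
      if: needs.discover.outputs.needs-compile == 'true'
      uses: example-org/upload-dependency@v1
    # Step 8: record the SHA256 and bucket URI in the metadata
    - name: Update metadata
      if: needs.discover.outputs.needs-compile == 'true'
      run: ./scripts/update-metadata.sh
```

Keeping steps 7 and 8 behind the same `if` condition as step 5 is what couples metadata updates to the compile/upload path, as discussed below in the thread about whether step 8 warrants a separate workflow.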
Have we considered making this a separate workflow that can be used in the rare cases where it is needed? Then the update/metadata workflow could be the same in all cases, the only difference being it looks for our compiled dependency instead of an upstream artifact.
I'm not sure I understand why this should be a separate workflow. I think it can pretty easily be coupled with the compilation job in the workflow, such that if the compilation/artifact upload step runs, then the SHA256 and URI are just updated in the metadata. It seems pretty coupled; I can't wrap my head around why we'd need a separate mechanism to do this.
The other steps, such as the original metadata generation and the `buildpack.toml` update, will be the same in every case.
@dmikusa-pivotal the latest commit adds some generalization around where dependency data is stored so that we aren't locked into the
@paketo-buildpacks/steering-committee @paketo-buildpacks/dependencies @robdimsdale @ForestEckhardt @fg-j anything needed here? Looking for comments or approvals!
Summary
Outlines the new process for discovering and updating dependencies.