feat(api)!: add support for pipelines #615
Conversation
Codecov Report

```diff
@@            Coverage Diff             @@
##           master     #615      +/-   ##
==========================================
+ Coverage   55.22%   55.47%   +0.25%
==========================================
  Files         196      195       -1
  Lines       15781    15741      -40
==========================================
+ Hits         8715     8733      +18
+ Misses       6704     6644      -60
- Partials      362      364       +2
```
cognifloyd
left a comment
It might be large, but it is repetitive (follows a pattern), so it wasn't that bad to review. Thanks for the overview. Looks great to me!
wass3r
left a comment
nice work. first pass - see comments
one additional nitpick.. not sure if it's just me, but i'd almost like to see `:pipeline` changed to `:sha` in the paths (e.g. `/api/v1/pipelines/:org/:repo/:pipeline`) or something. maybe that's leftover from when you used IDs? for documentation purposes that seems more clear to me.
```go
	"pipeline": p.GetCommit(),
	"repo":     r.GetName(),
	"user":     u.GetName(),
}).Infof("compiling pipeline %s", entry)
```
with the given fields, the contents of `%s` don't provide any new information. should we drop it?
We certainly could 👍 I'm leaning towards leaving it in there since it doesn't hurt.
This pattern is adopted from existing log entries we already established for other API endpoints.
So if we did want to switch this up, I think we'd have to go back and apply the same changes for consistency.
Fair point, I'd vote for removing it maybe in another PR. Seems like a waste of bytes that can add up - then again, this is more compact than logging as separate JSON fields :disappear:
Yeah I see where you're coming from on that 👍
To me, the message for the log entry is designed for human processing, i.e. a user who manually reviews the log entries or uses a tool such as grep or awk.
In those scenarios, it can add a bit of complexity when trying to search for multiple fields but not impossible:
```sh
# searching for a specific message
cat log.txt | grep '<org>/<repo>/<commit>'

# searching for specific fields
cat log.txt | grep '"org": <org>' | grep '"repo": <repo>' | grep '"pipeline": <commit>'
```

While the fields for the log entry are designed for system processing,
i.e. most apps/tools consuming logs enable filtering on one or more fields which improves this experience
However, if folks feel strongly enough about it, then I could get behind shortening the message
thanks for addressing feedback. i see there were no hot takes on `:pipeline` vs `:sha` or something similar in the URL path for docs. it's not crucial, just something I felt took extra brain cycles to understand what is expected, though the description for the param defines it more clearly.
I'll let this comment sit for a bit to allow for response or others to chime in. otherwise 👍🏼
@wass3r I think I could get behind that, but I'd want to encourage that as a separate PR 👍 The reason is that it would divert from the existing pattern we have today, e.g.
So we'd likely want to update multiple files and API paths to be more descriptive if that's what we're going for. I also think a point could be made that a user wanting to use the Vela API shouldn't solely rely on the endpoint pathing. Feels like that's the purpose of the other docs, like the OpenAPI (Swagger) spec and supplemental Vela API section:
That's fair. I think with
Dependent on #574, #626, and #627
Part of the effort for go-vela/community#460
This BREAKING CHANGE adds first-class support for `pipelines` in the system.

API

This adds CRUD support for `pipelines` to the `github.com/go-vela/server/api` package.

Previously, we hosted a suite of API endpoints for `pipelines` that enabled various interactions with them:

server/router/pipeline.go (Lines 17 to 21 in 29562a4)
A `ref` query param (usually with a commit SHA as the value) was used to tell Vela which pipeline to fetch for a repo. If no `ref` query param was provided, then Vela would use the default branch for the repo.

Now, these endpoints have been modified to require the commit SHA in the path (`:pipeline`):

server/router/pipeline.go (Lines 19 to 27 in da3142e)
This means the lookup for `pipelines` stored in our system is based off the commit SHA provided in the path. If a pipeline hasn't been stored in the Vela system with that commit SHA, we'll now return a `404 Not Found` error.

Compiler
Both the `Compile()` and `CompileLite()` functions have been updated to return a `*library.Pipeline` type:

server/compiler/engine.go (Lines 20 to 28 in da3142e)
This returned object will have most of its fields set, excluding `repo_id`, `commit`, and `ref` (which are set from the `build`):

server/compiler/native/compile.go (Lines 50 to 53 in da3142e)

The `Parse()` function was also updated to return a `[]byte`:

server/compiler/engine.go (Lines 34 to 36 in da3142e)
This contains the raw pipeline, meaning whatever file is fetched from the source provider. This is used to set the `data` field, which will be compressed before storing in the database.

Webhook
The webhook workflow has been modified to start storing `pipelines` fetched during this process in the Vela system. For every received webhook, we make a call to the database to see if that pipeline already exists:

server/api/webhook.go (Lines 386 to 402 in da3142e)
If it doesn't, then we fall back to fetching the pipeline configuration file from the SCM. After we've ensured it's not an empty build (only `init` and/or `clone`), we store the pipeline in the database:

server/api/webhook.go (Lines 506 to 536 in da3142e)