Conversation

@jbrockopp (Contributor) commented Mar 18, 2022

Dependent on #574, #626, and #627

Part of the effort for go-vela/community#460

This BREAKING CHANGE adds first-class support for pipelines in the system.

API

This adds CRUD support for pipelines to the github.com/go-vela/server/api package.

Previously, we hosted a suite of API endpoints for pipelines that enabled various interactions with them:

// GET /api/v1/pipelines/:org/:repo
// GET /api/v1/pipelines/:org/:repo/templates
// POST /api/v1/pipelines/:org/:repo/expand
// POST /api/v1/pipelines/:org/:repo/compile
// POST /api/v1/pipelines/:org/:repo/validate

A ref query param (usually with a commit SHA as the value) was used to tell Vela which pipeline to fetch for a repo.

If no ref query param was provided, then Vela would use the default branch for the repo.

Now, these endpoints have been reworked into a full CRUD suite, and the commit SHA is required in the path (:pipeline) wherever a single pipeline is referenced:

// POST /api/v1/pipelines/:org/:repo
// GET /api/v1/pipelines/:org/:repo
// GET /api/v1/pipelines/:org/:repo/:pipeline
// PUT /api/v1/pipelines/:org/:repo/:pipeline
// DELETE /api/v1/pipelines/:org/:repo/:pipeline
// GET /api/v1/pipelines/:org/:repo/:pipeline/templates
// POST /api/v1/pipelines/:org/:repo/:pipeline/expand
// POST /api/v1/pipelines/:org/:repo/:pipeline/compile
// POST /api/v1/pipelines/:org/:repo/:pipeline/validate

This means the lookup for pipelines stored in our system is based on the commit SHA provided in the path.

If a pipeline hasn't been stored in the Vela system with that commit SHA, we'll now return a 404 Not Found error.
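For illustration, here is a minimal client-side sketch of fetching a stored pipeline by commit SHA using Go's standard library. The host, org, repo, SHA, and token values are placeholders, and the Bearer auth header is an assumption about how the instance is configured, not part of this change:

package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// previously: GET /api/v1/pipelines/<org>/<repo>?ref=<commit-sha>
	// now the commit SHA is part of the path (placeholder values shown)
	url := "https://vela.example.com/api/v1/pipelines/<org>/<repo>/<commit-sha>"

	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		panic(err)
	}

	// assumed token-based auth header; adjust for your Vela instance
	req.Header.Set("Authorization", "Bearer <token>")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// a pipeline that was never stored for this commit SHA now yields 404 Not Found
	if resp.StatusCode == http.StatusNotFound {
		fmt.Println("no pipeline stored for this commit SHA")
		return
	}

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body))
}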

Compiler

Both the Compile() and CompileLite() functions have been updated to return a *library.Pipeline type:

// Compile defines a function that produces an executable
// representation of a pipeline from an object. This calls
// Parse internally to convert the object to a yaml configuration.
Compile(interface{}) (*pipeline.Build, *library.Pipeline, error)
// CompileLite defines a function that produces a light executable
// representation of a pipeline from an object. This calls
// Parse internally to convert the object to a yaml configuration.
CompileLite(interface{}, bool, bool, []string) (*yaml.Build, *library.Pipeline, error)
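As a rough usage sketch of the new signature (this helper, its package, and its parameters are hypothetical, not code from this PR; it only assumes the Engine interface shown above):

package sketch

import (
	"fmt"

	"github.com/go-vela/server/compiler"
)

// CompileAndInspect is an illustrative helper showing how a caller consumes the
// new return value; engine and config stand in for whatever the caller already has.
func CompileAndInspect(engine compiler.Engine, config []byte) error {
	// compile the fetched configuration; the *library.Pipeline return value is new
	build, _pipeline, err := engine.Compile(config)
	if err != nil {
		return err
	}

	// build (*pipeline.Build) remains the executable representation handed to workers,
	// while _pipeline (*library.Pipeline) is the record that gets stored in the database,
	// carrying the raw configuration in its data field
	fmt.Printf("compiled pipeline with %d steps and %d bytes of data\n",
		len(build.Steps), len(_pipeline.GetData()))

	return nil
}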

This returned object will have most of its fields set, excluding repo_id, commit, and ref (which are set later from the build):

// create the library pipeline object from the yaml configuration
_pipeline := p.ToPipelineLibrary()
_pipeline.SetData(data)
_pipeline.SetType(c.repo.GetPipelineType())

The Parse() function was also updated to return a []byte:

// Parse defines a function that converts
// an object to a yaml configuration.
Parse(interface{}, string, map[string]interface{}) (*yaml.Build, []byte, error)

This contains the raw pipeline, meaning whatever file is fetched from the source provider.

This is used to set the data field, which is compressed before it is stored in the database.
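The exact compression scheme is a database-layer detail not shown here; the following is just a hedged sketch of the general idea using only the standard library (gzip is illustrative, not necessarily what Vela uses):

package main

import (
	"bytes"
	"compress/gzip"
	"fmt"
)

// compress is an illustrative helper, not the database package's implementation:
// it gzips the raw pipeline bytes so large YAML files shrink before persistence.
func compress(data []byte) ([]byte, error) {
	var buf bytes.Buffer

	w := gzip.NewWriter(&buf)
	if _, err := w.Write(data); err != nil {
		return nil, err
	}

	// Close flushes any remaining output into the buffer
	if err := w.Close(); err != nil {
		return nil, err
	}

	return buf.Bytes(), nil
}

func main() {
	raw := []byte("version: \"1\"\nsteps:\n  - name: test\n    image: golang:latest\n    commands:\n      - go test ./...\n")

	small, err := compress(raw)
	if err != nil {
		panic(err)
	}

	fmt.Printf("raw: %d bytes, compressed: %d bytes\n", len(raw), len(small))
}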

Webhook

The webhook workflow has been modified to start storing the pipelines it fetches in the Vela system.

For every received webhook, we make a call to the database to see if that pipeline already exists:

server/api/webhook.go

Lines 386 to 402 in da3142e

// send API call to attempt to capture the pipeline
pipeline, err = database.FromContext(c).GetPipelineForRepo(b.GetCommit(), r)
if err != nil { // assume the pipeline doesn't exist in the database yet
	// send API call to capture the pipeline configuration file
	config, err = scm.FromContext(c).ConfigBackoff(u, r, b.GetCommit())
	if err != nil {
		retErr := fmt.Errorf("%s: failed to get pipeline configuration for %s: %w", baseErr, r.GetFullName(), err)

		util.HandleError(c, http.StatusNotFound, retErr)

		h.SetStatus(constants.StatusFailure)
		h.SetError(retErr.Error())

		return
	}
} else {
	config = pipeline.GetData()
}

If it doesn't, then we fall back to fetching the pipeline configuration file from the SCM.

After we've ensured it's not an empty build (i.e., one containing only the injected init and/or clone steps; a sketch of that check follows the excerpt below), we store the pipeline in the database:

server/api/webhook.go

Lines 506 to 536 in da3142e

// check if the pipeline did not already exist in the database
if pipeline == nil {
	pipeline = compiled
	pipeline.SetRepoID(r.GetID())
	pipeline.SetCommit(b.GetCommit())
	pipeline.SetRef(b.GetRef())

	// send API call to create the pipeline
	err = database.FromContext(c).CreatePipeline(pipeline)
	if err != nil {
		retErr := fmt.Errorf("%s: failed to create pipeline for %s: %w", baseErr, r.GetFullName(), err)

		util.HandleError(c, http.StatusBadRequest, retErr)

		h.SetStatus(constants.StatusFailure)
		h.SetError(retErr.Error())

		return
	}

	// send API call to capture the created pipeline
	pipeline, err = database.FromContext(c).GetPipelineForRepo(pipeline.GetCommit(), r)
	if err != nil {
		// nolint: lll // ignore long line length due to error message
		retErr := fmt.Errorf("%s: failed to get new pipeline %s/%s: %w", baseErr, r.GetFullName(), pipeline.GetCommit(), err)

		util.HandleError(c, http.StatusInternalServerError, retErr)

		h.SetStatus(constants.StatusFailure)
		h.SetError(retErr.Error())

		return
	}
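The empty-build guard mentioned above is not part of this excerpt; a simplified sketch of that kind of check might look like the following (the helper name is hypothetical, and only steps-style pipelines are handled):

package sketch

import "github.com/go-vela/types/pipeline"

// isEmptyBuild is an illustrative helper: it reports whether the compiled
// pipeline contains nothing beyond the injected init and clone steps,
// in which case there is nothing worth storing or running.
func isEmptyBuild(p *pipeline.Build) bool {
	for _, s := range p.Steps {
		if s.Name != "init" && s.Name != "clone" {
			return false
		}
	}

	return true
}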

@jbrockopp self-assigned this Mar 18, 2022
codecov bot commented Mar 18, 2022

Codecov Report

Merging #615 (da3142e) into master (ccfff2a) will increase coverage by 0.25%.
The diff coverage is 17.66%.


@@            Coverage Diff             @@
##           master     #615      +/-   ##
==========================================
+ Coverage   55.22%   55.47%   +0.25%     
==========================================
  Files         196      195       -1     
  Lines       15781    15741      -40     
==========================================
+ Hits         8715     8733      +18     
+ Misses       6704     6644      -60     
- Partials      362      364       +2     
Impacted Files Coverage Δ
api/build.go 1.66% <0.00%> (-0.20%) ⬇️
api/webhook.go 0.00% <0.00%> (ø)
compiler/native/compile.go 61.75% <50.79%> (+0.13%) ⬆️
compiler/native/parse.go 84.61% <85.71%> (+0.97%) ⬆️

@jbrockopp changed the title from feat(api): add support for pipelines to feat(api)!: add support for pipelines on Mar 25, 2022
@jbrockopp marked this pull request as ready for review April 22, 2022 18:12
@jbrockopp requested a review from a team as a code owner April 22, 2022 18:12
@cognifloyd (Member) previously approved these changes Apr 23, 2022 and left a comment

It might be large, but it is repetitive (follows a pattern), so it wasn't that bad to review. Thanks for the overview. Looks great to me!

@wass3r (Collaborator) left a comment

nice work. first pass - see comments

one additional nitpick.. not sure if it's just me, but i'd almost like to see :pipeline changed to :sha in the paths (eg. /api/v1/pipelines/:org/:repo/:pipeline) or something. maybe that's leftover from when you used IDs? for documentation purposes that seems more clear to me.

"pipeline": p.GetCommit(),
"repo": r.GetName(),
"user": u.GetName(),
}).Infof("compiling pipeline %s", entry)
Collaborator

with the given fields, the contents of %s don't provide any new information. should we drop it?

Contributor Author

We certainly could 👍 I'm leaning towards leaving it in there since it doesn't hurt.

This pattern is adopted from existing log entries we already established for other API endpoints.

So if we did want to switch this up, I think we'd have to go back and apply the same changes for consistency.

Collaborator

Fair point, I'd vote for removing it maybe in another PR. Seems like a waste of bytes that can add up - then again, this is more compact than logging as separate JSON fields :disappear:

Contributor Author

Yeah I see where you're coming from on that 👍

To me, the message for the log entry is designed for human processing

i.e. a user who manually reviews the log entries or uses a tool such as grep or awk

In those scenarios, it can add a bit of complexity when trying to search for multiple fields, but it's not impossible:

# searching for specific message
cat log.txt | grep '<org>/<repo>/<commit>'

# searching for specific fields
cat log.txt | grep '"org": <org>' | grep '"repo": <repo>' | grep '"pipeline": <commit>'

While the fields for the log entry are designed for system processing

i.e. most apps/tools consuming logs enable filtering on one or more fields which improves this experience

However, if folks feel strongly enough about it, then I could get behind shortening the message

@wass3r (Collaborator) left a comment

thanks for addressing feedback. i see there were no hot takes on :pipeline vs :sha or something similar in URL path for docs. it's not crucial, just something I felt took extra brain cycles to understand what is expected, though the description for the param defines it more.

I'll let this comment sit for a bit to allow for response or others to chime in. otherwise 👍🏼

@jbrockopp (Contributor Author)

@wass3r I think I could get behind that but I'd want to encourage that as a separate PR 👍

The reason is that it would diverge from the existing pattern we have today.

e.g.

  • /api/v1/repos/:org/:repo - we use :repo instead of :name or :repoName
  • /api/v1/repos/:org/:repo/builds/:build - we use :build instead of :number or :buildNumber
  • /api/v1/deployments/:org/:repo/:deployment - we use :deployment instead of :number or :deploymentNumber
  • /api/v1/hooks/:org/:repo/:hook - we use :hook instead of :number or :hookNumber

So we'd likely want to update multiple files and API paths to be more descriptive if that's what we're going for.

I also think a point could be made that a user wanting to use the Vela API shouldn't solely rely on the endpoint pathing.

Feels like that's the purpose for the other docs like the OpenAPI (Swagger) spec and the supplemental Vela API section:

https://go-vela.github.io/docs/reference/api/

@wass3r (Collaborator) commented Apr 26, 2022

That's fair. I think with :repo (and most other path fragments), it's relatively intuitive to infer that you're looking for either "name" or "number", but :pipeline being something totally different kind of throws it for a loop. But I digress - I prefaced with it being a nitpick and not something that ought to stop the merge. Thanks for the additional comment. And I agree that there is documentation elsewhere to elaborate.


Labels: feature (Indicates a new feature)
