update dasht-query-line and dasht-query-html to work with tsv #56
Changes from 1 commit: dd09eb2
```diff
@@ -4,7 +4,7 @@
 #
 # ## NAME
 #
-# dasht-query-line - searches [Dash] docsets and emits groups of lines
+# dasht-query-line - searches [Dash] docsets and emits results as tsv
 #
 # ## SYNOPSIS
 #
```
```diff
@@ -28,8 +28,8 @@
 #
 # Searches for *PATTERN* in all installed [Dash] docsets, optionally searching
 # only in those whose names match *DOCSET*s, by calling dasht-query-exec(1)
-# and emits the results in groups of lines, as described in "Results" below.
-# However, if no results were found, this program exits with a nonzero status.
+# and emits the results as TSV. However, if no results were found, this program
+# exits with a nonzero status.
 #
 # ### Searching
 #
```
```diff
@@ -42,26 +42,24 @@
 #
 # ### Results
 #
-# Each search result is printed to stdout as a group of four lines of text:
+# Each search result is printed to stdout as a tab-separated line with fields:
 #
-# `name` `=` *VALUE*
+# `name`
 #   Name of the token that matched the *PATTERN*.
 #
-# `type` `=` *VALUE*
+# `type`
 #   Type of the token, as defined in the docset.
 #
-# `from` `=` *VALUE*
+# `from`
 #   Name of the docset this result was found in.
 #
-# `url` `=` *VALUE*
+# `url`
 #   URL of the API documentation for this result.
 #
-# For example, here is a search result for "c - x" from the "bash" docset:
+# For example, here is a search result for "c - x" from the "bash" docset, with
+# tab characters represented by "<TAB>":
 #
-#     name = undo (C-_ or C-x C-u)
-#     type = Function
-#     from = Bash
-#     url = file:///home/sunny/.local/share/dasht/docsets/Bash.docset/Contents/Resources/Documents/bash/Miscellaneous-Commands.html#//apple_ref/Function/undo%20%28C%2D%5F%20or%20C%2Dx%20C%2Du%29
+#     undo (C-_ or C-x C-u)<TAB>Function<TAB>Bash<TAB>file:///home/sunny/.local/share/dasht/docsets/Bash.docset/Contents/Resources/Documents/bash/Miscellaneous-Commands.html#//apple_ref/Function/undo%20%28C%2D%5F%20or%20C%2Dx%20C%2Du%29
 #
 # ## ENVIRONMENT
 #
```
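Since the fields are tab-separated, a caller can split each result line with a plain `read`. The sketch below is illustrative only: it feeds a hard-coded sample line (mimicking the documented output) through the loop instead of invoking dasht-query-line itself.

```shell
# Parse a dasht-query-line-style TSV result line into its four fields.
# The sample line is hard-coded for illustration.
tab="$(printf '\t')"
printf 'undo (C-_ or C-x C-u)\tFunction\tBash\tfile:///tmp/doc.html\n' |
while IFS="$tab" read -r name type from url; do
  # IFS contains only a tab, so spaces inside a field are preserved
  echo "name=$name"
  echo "from=$from"
done
```

Setting `IFS` to a single tab for the duration of `read` is what keeps multi-word token names (like `undo (C-_ or C-x C-u)`) intact.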
```diff
@@ -245,9 +243,11 @@ dasht-docsets "$@" | while read -r docset; do
     { $1 = $1 } # strip whitespace from key

+    $2 == "=" {
+      result[$1] = substr($0, index($0, $2) + length($2) + 1)
+    }
+
     $1 == "url" { were_any_results_found=1
-      # indicate the source of this result
-      print "from = " docset
-
       # strip embedded XML from result URL
       gsub("<.*>", "", $3)
```
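The new `$2 == "="` rule accumulates `key = value` records (the shape sqlite3 emits in its `-line` output mode) into an awk array instead of printing them directly. Here is a self-contained demo of just that parsing step, using hard-coded sample records in place of dasht-query-exec output:

```shell
# Feed two "key = value" lines through the same parsing rules as above.
printf 'name = undo\ntype = Function\n' | awk '
  { $1 = $1 }   # rebuild $0, stripping surrounding whitespace
  $2 == "=" {
    # everything after the "= " separator is the value
    result[$1] = substr($0, index($0, $2) + length($2) + 1)
  }
  END { print result["name"] "-" result["type"] }
'
```

Using `substr` from just past the first `=` (rather than `$3`) keeps values that themselves contain spaces or `=` characters intact.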
```diff
@@ -259,9 +259,12 @@ dasht-docsets "$@" | while read -r docset; do
       # resolve URL to filesystem location
       $3 = file_url $3
-    }

-    /./ # reject any empty lines from input
+      # print TSV line
+      printf "%s\t%s\t%s\t%s\n", result["name"], result["type"], \
+        docset, $3
+    }

     END { exit !were_any_results_found }
   ' && kill -s USR1 $$ || : # notify this script if any results were found
```

Note the `printf` argument order here follows the documented field order (`name`, `type`, `from`, `url`); the version originally pushed in this commit emitted `result["name"], docset, result["type"], $3`, which would put the `from` field second, contradicting the documentation above.
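The `kill -s USR1 $$` at the end is how the awk pipeline, whose nonzero exit is swallowed by `|| :` inside the `while` loop, tells the enclosing script that at least one result was found. A minimal sketch of that signal handshake (variable names are illustrative, not from the script):

```shell
#!/bin/sh
# Parent shell traps SIGUSR1; a pipeline signals success back to it.
found=false
trap 'found=true' USR1

# stand-in for the awk pipeline: exit status 0 means "results found"
( exit 0 ) && kill -s USR1 $$ || :

# the trap runs before the next command, so $found is updated here
echo "$found"
```

This pattern is useful because the exit status of a command inside a `while read` loop cannot otherwise reach the code after the loop: the loop runs in a pipeline, and each iteration's failures are deliberately ignored.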