3 Pipeline Macro Library
(require shell/pipeline-macro)  package: shell-pipeline
3.1 shell/pipeline-macro stability
This library is not entirely stable.
The base set of pipeline operators is likely to change, and I want to review some of the names before a stable release.
3.2 shell/pipeline-macro guide
This module is a macro DSL wrapper around the Mixed Unix-style and Racket Object Pipelines library. It is designed for running pipelines of external processes (which pass each other ports) and Racket functions (which pass each other objects). It does this with a very flat syntax and user-definable pipeline operators, which provide a lot of convenient sugar for making pipelines shorter. It is particularly tailored for use in a line-based syntax, like that of the Rash language.
Here are some quick examples:
;; Pipe the output of ls to grep.
(run-pipeline =unix-pipe= ls -l =unix-pipe= grep foo)

;; To save on space, let's assume % is bound to =unix-pipe=
(run-pipeline % ls -l % grep foo)
We can also pipeline Racket objects. Object pipelines are made of functions instead of process specifications.
;; This will return 2
(run-pipeline =object-pipe= list 1 2 3 =object-pipe= second)

;; To save on space, let's assume %> is bound to =object-pipe=
(run-pipeline %> list 1 2 3 %> second)
We can mix the two:
;; Upper-cased ls output. =object-pipe= automatically converts ports to strings.
(run-pipeline % ls -l %> string-upcase)
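Putting these together, here is a minimal sketch of a complete module using the operators directly (the % and %> shorthands above are assumed to be user-provided bindings and are not used here):

#lang racket/base
(require shell/pipeline-macro)

;; Run `ls -l`, let =object-pipe= convert the output port to a string,
;; and upper-case the result.
(define upper-listing
  (run-pipeline =unix-pipe= ls -l =object-pipe= string-upcase))
(displayln upper-listing)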
I am really running out of steam for documenting right now... TODO - write a good guide.
3.3 shell/pipeline-macro reference
3.3.1 Running Pipelines
syntax
(run-pipeline pipeline-flag ... pipeline-member-spec ... pipeline-flag ...)
pipeline-member-spec = pipe-operator pipe-operator-arg ...

pipeline-flag = &bg
              | &pipeline-ret
              | &in file-expression
              | &< file-name
              | &out file-expression
              | &> file-name
              | &>> file-name
              | &>! file-name
              | &err file-expression
              | &strict
              | &permissive
              | &lazy
              | &lazy-timeout timeout-expression
              | &env env-expression
The pipeline flags affect the options passed to shell/mixed-pipeline/run-pipeline and are documented separately.
The pipeline-member-specs are transformed according to the pipeline operators given. If the first non-flag argument to run-pipeline is not a pipeline operator, then a default is put in its place as determined by default-pipeline-operator. The full names of pipeline operators are conventionally identifiers surrounded with = signs.
At the time of writing I’m not really sure what to write here, so have an example:
(run-pipeline =object-pipe= list 1 2 3 =for/list= + 1 current-pipeline-argument =for/list= + 1)
(list 3 4 5)
(run-pipeline =object-pipe= list 1 2 3 =for/list= + 1 current-pipeline-argument =for/list= + 1 &bg)
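With &bg, run-pipeline returns a pipeline object immediately instead of blocking for the final value. The object can then be waited on and inspected with the procedures listed under Inspecting Pipelines below. A minimal sketch:

(define bg-pipe
  (run-pipeline =object-pipe= list 1 2 3 =object-pipe= length &bg))
(pipeline-wait bg-pipe)   ; block until the pipeline has finished
(pipeline-return bg-pipe) ; the final value, presumably 3 here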
3.3.2 Pipeline Flags
pipeline-flag
(&bg)
pipeline-flag
(&pipeline-ret)
pipeline-flag
(&in port-expression)
pipeline-flag
(&< file-name)
pipeline-flag
(&out port/reader-expression)
pipeline-flag
(&> file-name)
pipeline-flag
(&>! file-name)
pipeline-flag
(&>> file-name)
pipeline-flag
(&err port-expression)
pipeline-flag
(&strict)
pipeline-flag
(&permissive)
pipeline-flag
(&lazy)
pipeline-flag
(&lazy-timeout timeout-expression)
&< file-name - redirects input from the given file.
&> file-name - redirects output to the given file, raising an error if the file already exists.
&>> file-name - redirects output, appending to the given file.
&>! file-name - redirects output, truncating the given file if it already exists.
&in, &out, and &err each take an argument suitable to be passed to the #:in, #:out, and #:err arguments of shell/mixed-pipeline/run-pipeline.
&bg and &pipeline-ret toggle #:bg and #:return-pipeline-object, and &strict, &permissive, and &lazy set the #:strictness argument.
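For illustration, a sketch of the file redirection flags, assuming (by analogy with the #:e> option of =basic-unix-pipe= below) that the file names are given as bare identifiers:

;; Read grep's input from input.txt and append its output to log.txt.
;; input.txt and log.txt are hypothetical file names.
(run-pipeline =unix-pipe= grep foo
              &< input.txt
              &>> log.txt)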
3.3.3 Pipeline Operators
The core module only provides a few simple pipeline operators. There are many more in the demo/ directory in the source repository. Most of them are hastily written experiments, but some good ones should eventually be standardized.
pipeline-operator
(=composite-pipe= (pipe-op arg ...) ...+)
pipeline-operator
(=basic-unix-pipe= options ... args ...+)
Each option takes an argument and must come before the ordinary command arguments. The options are as follows:
#:as - Sugar for adding an object pipeline member afterward that parses the output somehow. It should be given either #f (no transformation), a port-reading function (e.g. port->string), or one of a pre-set list of symbols: 'string, 'trim, 'lines, or 'words. (A short sketch follows this option list.)
#:e> - Accepts a file name (as an identifier), redirects the error stream to that file. Produces an error if the file exists.
#:e>! - Accepts a file name (as an identifier), redirects the error stream to that file. Truncates the file if it exists.
#:e>> - Accepts a file name (as an identifier), redirects the error stream to that file. Appends to the file if it exists.
#:err - Takes an expression to produce an error redirection value suitable for unix-pipeline-member-spec.
#:success - Takes an expression suitable for the #:success argument of unix-pipeline-member-spec.
TODO - env modification
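As a sketch of the #:as option mentioned above, and assuming (as the separate =quoting-basic-unix-pipe= operator suggests) that =basic-unix-pipe= evaluates rather than quotes its arguments:

;; Run `ls -l` and get its output back as a list of lines.
(run-pipeline =basic-unix-pipe= #:as 'lines 'ls '-l)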
pipeline-operator
(=quoting-basic-unix-pipe= options ... args ...+)
This operator quotes bare identifiers instead of evaluating them; parenthesized forms are still evaluated as Racket expressions, which serves as the escape for unquoting, as in this example:

(define x "/etc")
(define-syntax id (syntax-parser [(_ x) #'x]))

;; I find I really don't mind this as a means of unquoting here.
(run-pipeline =quoting-basic-unix-pipe= ls (id x))
pipeline-operator
(=unix-pipe= arg ...+)
It expands its arguments (for example, the $ variable references and the glob pattern in the examples below), and after all that expansion it passes through to =quoting-basic-unix-pipe=.
However, if the first argument is a pipeline alias defined with define-pipeline-alias or define-simple-pipeline-alias, then the operator from that alias is swapped in instead, skipping everything else that this operator would normally do.
(run-pipeline =unix-pipe= echo $HOME/*.rkt)
(define-simple-pipeline-alias d 'ls '--color=auto)
(define dfdir 'dotfiles)
(run-pipeline =unix-pipe= d $HOME/$dfdir)
pipeline-operator
(\| arg ...+)
Note that the backslash is required in the normal Racket reader because | is normally treated specially. In the Rash reader, you can get this by typing just |.
pipeline-operator
(=basic-object-pipe/expression= arg ...+)
pipeline-operator
(=basic-object-pipe/form= arg ...+)
As with other object pipes, when used as a pipeline starter it generates a lambda with no arguments, and as a pipeline joint it generates a lambda with one argument, current-pipeline-argument.
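For example, a sketch of using it as a pipeline joint with an explicit reference to current-pipeline-argument:

;; The joint below becomes roughly
;; (lambda (current-pipeline-argument) (cons 0 current-pipeline-argument)),
;; so this pipeline returns '(0 1 2 3).
(run-pipeline =object-pipe= list 1 2 3
              =basic-object-pipe/form= cons 0 current-pipeline-argument)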
pipeline-operator
(=basic-object-pipe= arg ...+)
pipeline-operator
(\|> arg ...+)
Note that the backslash is required in the normal Racket reader because | is normally treated specially. In the Rash reader, you can get this by typing just |>.
pipeline-operator
(=object-pipe/expression= arg ...+)
pipeline-operator
(=object-pipe/form= arg ...+)
pipeline-operator
(=object-pipe= arg ...+)
pipeline-operator
(\|>> arg ...+)
Note that the backslash is required in the normal Racket reader because | is normally treated specially. In the Rash reader, you can get this by typing just |>>.
I’ve written various other pipeline operators that are a little more exciting and that currently live in the demo directory of the repository. I’ll eventually polish them up and put them somewhere stable. They include things like Unix pipes that automatically glob things, Unix pipes that have lexically scoped alias resolution, =filter=, =for/list=, =for/stream=, =for/list/unix-arg=, =for/list/unix-input=...
3.3.4 Defining Pipeline Operators
syntax
(define-pipeline-operator name start-or-joint ...)
start-or-joint = #:start transformer | #:joint transformer
If a transformer function is not specified for one of the options, a default implementation (that generates an error) is used.
The transformer will receive a syntax object corresponding to (name-of-pipe argument ...), so it will likely want to ignore its first argument, as most macros do. But sometimes it may be useful to recur.
Example uses are in the demo directory in the repository.
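As an illustration only (a sketch, not taken from the library's own examples), a new operator can be defined by expanding into a use of an existing operator:

;; Assumes the transformer expressions are evaluated at expansion time,
;; hence the for-syntax requires.
(require shell/pipeline-macro
         (for-syntax racket/base syntax/parse))

;; A hypothetical operator that always pipes through `grep foo`.
(define-pipeline-operator =grep-foo=
  #:start (syntax-parser [(_ arg ...) #'(=unix-pipe= grep foo arg ...)])
  #:joint (syntax-parser [(_ arg ...) #'(=unix-pipe= grep foo arg ...)]))

;; Roughly equivalent to (run-pipeline =unix-pipe= ls -l =unix-pipe= grep foo)
(run-pipeline =unix-pipe= ls -l =grep-foo=)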
syntax
(pipeop name syntax-parser-clauses ...+)
pipeop is a more streamlined version of define-pipeline-operator. It defines a pipeline operator where #:start and #:joint use the same transformer. The syntax transformer used is basically (syntax-parser syntax-parser-clauses ...). I made this because I thought it would be a convenient way to swiftly define a new pipeline operator, even interactively in the Rash REPL.
Example uses are in the demo directory.
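Continuing the sketch above, the same hypothetical operator written with pipeop, where one set of syntax-parser clauses serves as both #:start and #:joint:

(pipeop =grep-foo2=
        [(_ arg ...) #'(=unix-pipe= grep foo arg ...)])

(run-pipeline =unix-pipe= ls -l =grep-foo2=)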
syntax
(define-pipeline-alias name transformer)
transformer must be a syntax transformer function, and must return a syntax object that starts with a pipeline operator.
;; Unix `find` has to take the directory first, but we want
;; to always add the -type f flag at the end.
(define-pipeline-alias find-f
  (syntax-parser
    [(_ arg ...) #'(=unix-pipe= find arg ... -type f)]))

;; these are equivalent
(run-pipeline =unix-pipe= find-f .)
(run-pipeline =unix-pipe= find . -type f)
syntax
(define-simple-pipeline-alias name cmd arg ...)
(define-simple-pipeline-alias d 'ls '--color=auto)
;; these are equivalent
(run-pipeline =unix-pipe= d -l $HOME)
(run-pipeline =unix-pipe= 'ls '--color=auto -l $HOME)
3.3.5 Inspecting Pipelines
procedure
pipeline? : procedure?
procedure
pipeline-success? : procedure?
procedure
pipeline-wait : procedure?
procedure
pipeline-return : procedure?
procedure
pipeline-start-ms : procedure?
procedure
pipeline-end-ms : procedure?
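For illustration, a sketch of running a pipeline in the background and inspecting it afterward (assuming the -ms procedures report millisecond timestamps):

(define p (run-pipeline =unix-pipe= ls -l =unix-pipe= grep foo &bg))
(pipeline-wait p)                               ; wait for every member to finish
(pipeline? p)                                   ; #t
(pipeline-success? p)                           ; #t if every member succeeded
(- (pipeline-end-ms p) (pipeline-start-ms p))   ; elapsed wall-clock time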