Fast fix for the previous two commits (sorry), plus a chroma update.
Well, running "git add ." from outside the repository's root is a bad idea. Updated the chroma dependency and added more languages to syntax highlighting.
vendor/github.com/alecthomas/chroma/README.md (generated, vendored; 56 lines changed)

@@ -12,24 +12,49 @@ translators for Pygments lexers and styles.
 
 <!-- MarkdownTOC -->
 
-- [Supported languages](#supported-languages)
-- [Using the library](#using-the-library)
-- [Quick start](#quick-start)
-- [Identifying the language](#identifying-the-language)
-- [Formatting the output](#formatting-the-output)
-- [The HTML formatter](#the-html-formatter)
-- [More detail](#more-detail)
-- [Lexers](#lexers)
-- [Formatters](#formatters)
-- [Styles](#styles)
-- [Command-line interface](#command-line-interface)
-- [What's missing compared to Pygments?](#whats-missing-compared-to-pygments)
+1. [Supported languages](#supported-languages)
+1. [Using the library](#using-the-library)
+1. [Quick start](#quick-start)
+1. [Identifying the language](#identifying-the-language)
+1. [Formatting the output](#formatting-the-output)
+1. [The HTML formatter](#the-html-formatter)
+1. [More detail](#more-detail)
+1. [Lexers](#lexers)
+1. [Formatters](#formatters)
+1. [Styles](#styles)
+1. [Command-line interface](#command-line-interface)
+1. [What's missing compared to Pygments?](#whats-missing-compared-to-pygments)
 
 <!-- /MarkdownTOC -->
 
 ## Supported languages
 
-ABNF, ANTLR, APL, ActionScript, ActionScript 3, Ada, Angular2, ApacheConf, AppleScript, Awk, BNF, Base Makefile, Bash, Batchfile, BlitzBasic, Brainfuck, C, C#, C++, CFEngine3, CMake, COBOL, CSS, Cap'n Proto, Ceylon, ChaiScript, Cheetah, Clojure, CoffeeScript, Common Lisp, Coq, Crystal, Cython, DTD, Dart, Diff, Django/Jinja, Docker, EBNF, Elixir, Elm, EmacsLisp, Erlang, FSharp, Factor, Fish, Forth, Fortran, GAS, GDScript, GLSL, Genshi, Genshi HTML, Genshi Text, Gnuplot, Go, Groovy, HTML, Handlebars, Haskell, Haxe, Hexdump, Hy, INI, Idris, Io, JSON, Java, JavaScript, Julia, Kotlin, LLVM, Lighttpd configuration file, Lua, Mako, Mason, Mathematica, MiniZinc, Modula-2, MySQL, Myghty, NASM, Newspeak, Nginx configuration file, Nim, OCaml, Octave, PHP, PL/pgSQL, POVRay, PacmanConf, Perl, Pig, PkgConfig, PostScript, PostgreSQL SQL dialect, PowerShell, Prolog, Protocol Buffer, Puppet, Python, Python 3, QBasic, R, Racket, Ragel, Rexx, Ruby, Rust, SPARQL, SQL, Sass, Scala, Scheme, Scilab, Smalltalk, Smarty, Snobol, SquidConf, SVG, Swift, TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, Thrift, Transact-SQL, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData, VHDL, VimL, XML, Xorg, YAML, cfstatement, markdown, reStructuredText, reg, systemverilog, verilog
+Prefix | Language
+:----: | --------
+A | ABNF, ActionScript, ActionScript 3, Ada, Angular2, ANTLR, ApacheConf, APL, AppleScript, Awk
+B | Ballerina, Base Makefile, Bash, Batchfile, BlitzBasic, BNF, Brainfuck
+C | C, C#, C++, Cassandra CQL, CFEngine3, cfstatement/ColdFusion, CMake, COBOL, CSS, Cap'n Proto, Ceylon, ChaiScript, Cheetah, Clojure, CoffeeScript, Common Lisp, Coq, Crystal, Cython
+D | Dart, Diff, Django/Jinja, Docker, DTD
+E | EBNF, Elixir, Elm, EmacsLisp, Erlang
+F | Factor, Fish, Forth, Fortran, FSharp
+G | GAS, GDScript, GLSL, Genshi, Genshi HTML, Genshi Text, Gnuplot, Go, Go HTML Template, Go Text Template, Groovy
+H | Handlebars, Haskell, Haxe, Hexdump, HTML, HTTP, Hy
+I | Idris, INI, Io
+J | Java, JavaScript, JSON, Jsx, Julia, Jungle
+K | Kotlin
+L | Lighttpd configuration file, LLVM, Lua
+M | Mako, Markdown, Mason, Mathematica, MiniZinc, Modula-2, MonkeyC, MorrowindScript, Myghty, MySQL
+N | NASM, Newspeak, Nginx configuration file, Nim, Nix
+O | Objective-C, OCaml, Octave, OpenSCAD, Org Mode
+P | PacmanConf, Perl, PHP, Pig, PkgConfig, Plaintext, PL/pgSQL, PostgreSQL SQL dialect, PostScript, POVRay, PowerShell, Prolog, Protocol Buffer, Puppet, Python, Python 3
+Q | QBasic
+R | R, Racket, Ragel, reg, reStructuredText, Rexx, Ruby, Rust
+S | Sass, Scala, Scheme, Scilab, SCSS, Smalltalk, Smarty, Snobol, Solidity, SPARQL, SQL, SquidConf, Swift, systemd, Systemverilog
+T | TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, TeX, Thrift, TOML, TradingView, Transact-SQL, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData
+V | verilog, VHDL, VimL
+W | WDTE
+X | XML, Xorg
+Y | YAML
 
 _I will attempt to keep this section up to date, but an authoritative list can be
 displayed with `chroma --list`._
@@ -133,7 +158,7 @@ err := formatter.Format(w, style, iterator)
 
 ### The HTML formatter
 
 By default the `html` registered formatter generates standalone HTML with
-embedded CSS. More flexibility is available through the `lexers/html` package.
+embedded CSS. More flexibility is available through the `formatters/html` package.
 
 Firstly, the output generated by the formatter can be customised with the
 following constructor options:
@@ -144,6 +169,7 @@ following constructor options:
 - `TabWidth(width)` - Set the rendered tab width, in characters.
 - `WithLineNumbers()` - Render line numbers (style with `LineNumbers`).
+- `HighlightLines(ranges)` - Highlight lines in these ranges (style with `LineHighlight`).
 - `LineNumbersInTable()` - Use a table for formatting line numbers and code, rather than spans.
 
 If `WithClasses()` is used, the corresponding CSS can be obtained from the formatter with:
@@ -187,6 +213,8 @@ Chroma styles use the [same syntax](http://pygments.org/docs/styles/) as Pygments.
 
 All Pygments styles have been converted to Chroma using the `_tools/style.py` script.
 
+For a quick overview of the available styles and how they look, check out the [Chroma Style Gallery](https://xyproto.github.io/splash/docs/).
+
 ## Command-line interface
 
 A command-line interface to Chroma is included. It can be installed with:
vendor/github.com/alecthomas/chroma/coalesce.go (generated, vendored; 13 lines changed)

@@ -6,14 +6,17 @@ func Coalesce(lexer Lexer) Lexer { return &coalescer{lexer} }
 type coalescer struct{ Lexer }
 
 func (d *coalescer) Tokenise(options *TokeniseOptions, text string) (Iterator, error) {
-	var prev *Token
+	var prev Token
 	it, err := d.Lexer.Tokenise(options, text)
 	if err != nil {
 		return nil, err
 	}
-	return func() *Token {
-		for token := it(); token != nil; token = it() {
-			if prev == nil {
+	return func() Token {
+		for token := it(); token != EOF; token = it() {
+			if len(token.Value) == 0 {
+				continue
+			}
+			if prev == EOF {
 				prev = token
 			} else {
 				if prev.Type == token.Type && len(prev.Value) < 8192 {
@@ -26,7 +29,7 @@ func (d *coalescer) Tokenise(options *TokeniseOptions, text string) (Iterator, e
 			}
 		}
 		out := prev
-		prev = nil
+		prev = EOF
 		return out
 	}, nil
 }
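The coalescer above now uses the `EOF` sentinel value instead of a nil pointer, and additionally skips empty tokens. The merging idea can be sketched standalone (a simplified `Token` type for illustration, not chroma's actual API):

```go
package main

import "fmt"

// Simplified stand-in for chroma's value-type Token.
type Token struct {
	Type  string
	Value string
}

// EOF: the zero value acts as the end-of-stream sentinel.
var EOF Token

// coalesce merges adjacent tokens of the same type and drops empty
// tokens, mirroring the coalescer's loop structure.
func coalesce(tokens []Token) []Token {
	var out []Token
	prev := EOF
	for _, t := range tokens {
		if len(t.Value) == 0 {
			continue
		}
		if prev == EOF {
			prev = t
		} else if prev.Type == t.Type {
			prev.Value += t.Value
		} else {
			out = append(out, prev)
			prev = t
		}
	}
	if prev != EOF {
		out = append(out, prev)
	}
	return out
}

func main() {
	in := []Token{{"Name", "fo"}, {"Name", "o"}, {"Text", " "}, {"Text", ""}, {"Name", "bar"}}
	fmt.Println(coalesce(in)) // runs of equal types merge; empty tokens vanish
}
```

Because `Token` is now a plain comparable struct, `prev == EOF` is an ordinary value comparison; with the old `*Token` representation the sentinel had to be `nil`.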
vendor/github.com/alecthomas/chroma/colour.go (generated, vendored; 10 lines changed)

@@ -60,10 +60,12 @@ func NewColour(r, g, b uint8) Colour {
 // This uses the approach described here (https://www.compuphase.com/cmetric.htm).
 // This is not as accurate as LAB, et. al. but is *vastly* simpler and sufficient for our needs.
 func (c Colour) Distance(e2 Colour) float64 {
-	rmean := int(c.Red()+e2.Red()) / 2
-	r := int(c.Red() - e2.Red())
-	g := int(c.Green() - e2.Green())
-	b := int(c.Blue() - e2.Blue())
+	ar, ag, ab := int64(c.Red()), int64(c.Green()), int64(c.Blue())
+	br, bg, bb := int64(e2.Red()), int64(e2.Green()), int64(e2.Blue())
+	rmean := (ar + br) / 2
+	r := ar - br
+	g := ag - bg
+	b := ab - bb
	return math.Sqrt(float64((((512 + rmean) * r * r) >> 8) + 4*g*g + (((767 - rmean) * b * b) >> 8)))
 }
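The point of the change above is to widen to `int64` before subtracting: the old code subtracted `uint8` values first, which wraps around whenever `e2` is brighter than `c` in a channel. The metric itself is unchanged; here is a standalone version on plain channel values (not chroma's `Colour` type):

```go
package main

import (
	"fmt"
	"math"
)

// distance implements the "redmean" colour metric from
// https://www.compuphase.com/cmetric.htm, widening to int64 first so
// that e.g. 0 - 255 yields -255 rather than a wrapped uint8 result.
func distance(r1, g1, b1, r2, g2, b2 uint8) float64 {
	ar, ag, ab := int64(r1), int64(g1), int64(b1)
	br, bg, bb := int64(r2), int64(g2), int64(b2)
	rmean := (ar + br) / 2
	r := ar - br
	g := ag - bg
	b := ab - bb
	return math.Sqrt(float64((((512 + rmean) * r * r) >> 8) + 4*g*g + (((767 - rmean) * b * b) >> 8)))
}

func main() {
	fmt.Printf("%.2f\n", distance(0, 0, 0, 255, 255, 255)) // black vs white: large
	fmt.Printf("%.2f\n", distance(10, 10, 10, 10, 10, 10)) // identical: 0.00
}
```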
vendor/github.com/alecthomas/chroma/delegate.go (new file, generated, vendored; 137 lines)

@@ -0,0 +1,137 @@
package chroma

import (
	"bytes"
)

type delegatingLexer struct {
	root     Lexer
	language Lexer
}

// DelegatingLexer combines two lexers to handle the common case of a language embedded inside another, such as PHP
// inside HTML or PHP inside plain text.
//
// It takes two lexers as arguments: a root lexer and a language lexer. First everything is scanned using the language
// lexer, which must return "Other" for unrecognised tokens. Then all "Other" tokens are lexed using the root lexer.
// Finally, these two sets of tokens are merged.
//
// The lexers from the template lexer package use this base lexer.
func DelegatingLexer(root Lexer, language Lexer) Lexer {
	return &delegatingLexer{
		root:     root,
		language: language,
	}
}

func (d *delegatingLexer) Config() *Config {
	return d.language.Config()
}

// An insertion is the character range where language tokens should be inserted.
type insertion struct {
	start, end int
	tokens     []Token
}

func (d *delegatingLexer) Tokenise(options *TokeniseOptions, text string) (Iterator, error) {
	tokens, err := Tokenise(Coalesce(d.language), options, text)
	if err != nil {
		return nil, err
	}
	// Compute insertions and gather "Other" tokens.
	others := &bytes.Buffer{}
	insertions := []*insertion{}
	var insert *insertion
	offset := 0
	var last Token
	for _, t := range tokens {
		if t.Type == Other {
			if last != EOF && insert != nil && last.Type != Other {
				insert.end = offset
			}
			others.WriteString(t.Value)
		} else {
			if last == EOF || last.Type == Other {
				insert = &insertion{start: offset}
				insertions = append(insertions, insert)
			}
			insert.tokens = append(insert.tokens, t)
		}
		last = t
		offset += len(t.Value)
	}

	if len(insertions) == 0 {
		return d.root.Tokenise(options, text)
	}

	// Lex the other tokens.
	rootTokens, err := Tokenise(Coalesce(d.root), options, others.String())
	if err != nil {
		return nil, err
	}

	// Interleave the two sets of tokens.
	var out []Token
	offset = 0 // Offset into text.
	tokenIndex := 0
	nextToken := func() Token {
		if tokenIndex >= len(rootTokens) {
			return EOF
		}
		t := rootTokens[tokenIndex]
		tokenIndex++
		return t
	}
	insertionIndex := 0
	nextInsertion := func() *insertion {
		if insertionIndex >= len(insertions) {
			return nil
		}
		i := insertions[insertionIndex]
		insertionIndex++
		return i
	}
	t := nextToken()
	i := nextInsertion()
	for t != EOF || i != nil {
		// fmt.Printf("%d->%d:%q %d->%d:%q\n", offset, offset+len(t.Value), t.Value, i.start, i.end, Stringify(i.tokens...))
		if t == EOF || (i != nil && i.start < offset+len(t.Value)) {
			var l Token
			l, t = splitToken(t, i.start-offset)
			if l != EOF {
				out = append(out, l)
				offset += len(l.Value)
			}
			out = append(out, i.tokens...)
			offset += i.end - i.start
			if t == EOF {
				t = nextToken()
			}
			i = nextInsertion()
		} else {
			out = append(out, t)
			offset += len(t.Value)
			t = nextToken()
		}
	}
	return Literator(out...), nil
}

func splitToken(t Token, offset int) (l Token, r Token) {
	if t == EOF {
		return EOF, EOF
	}
	if offset == 0 {
		return EOF, t
	}
	if offset == len(t.Value) {
		return t, EOF
	}
	l = t.Clone()
	r = t.Clone()
	l.Value = l.Value[:offset]
	r.Value = r.Value[offset:]
	return
}
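The interleaving loop in the new delegate.go relies on `splitToken` to cut a root token in two wherever an embedded-language insertion lands inside it. The helper can be sketched standalone (a simplified value-type `Token` for illustration, not chroma's API):

```go
package main

import "fmt"

// Simplified stand-in for chroma's value-type Token.
type Token struct {
	Type  string
	Value string
}

// EOF: the zero value acts as the "no token" sentinel.
var EOF Token

// splitToken cuts t at the byte offset, returning left and right halves.
// EOF stands in wherever a half would be empty, matching delegate.go.
func splitToken(t Token, offset int) (l, r Token) {
	if t == EOF {
		return EOF, EOF
	}
	if offset == 0 {
		return EOF, t
	}
	if offset == len(t.Value) {
		return t, EOF
	}
	// Token is a value type, so plain assignment clones it.
	l, r = t, t
	l.Value = t.Value[:offset]
	r.Value = t.Value[offset:]
	return l, r
}

func main() {
	l, r := splitToken(Token{"Text", "hello world"}, 5)
	fmt.Printf("%q %q\n", l.Value, r.Value) // "hello" " world"
}
```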
vendor/github.com/alecthomas/chroma/formatters/api.go (generated, vendored; 2 lines changed)

@@ -11,7 +11,7 @@ import (
 var (
 	// NoOp formatter.
 	NoOp = Register("noop", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, iterator chroma.Iterator) error {
-		for t := iterator(); t != nil; t = iterator() {
+		for t := iterator(); t != chroma.EOF; t = iterator() {
 			if _, err := io.WriteString(w, t.Value); err != nil {
 				return err
 			}
vendor/github.com/alecthomas/chroma/formatters/html/html.go (generated, vendored; 67 lines changed)

@@ -25,6 +25,9 @@ func WithClasses() Option { return func(f *Formatter) { f.Classes = true } }
 // TabWidth sets the number of characters for a tab. Defaults to 8.
 func TabWidth(width int) Option { return func(f *Formatter) { f.tabWidth = width } }
 
+// PreventSurroundingPre prevents the surrounding pre tags around the generated code
+func PreventSurroundingPre() Option { return func(f *Formatter) { f.preventSurroundingPre = true } }
+
 // WithLineNumbers formats output with line numbers.
 func WithLineNumbers() Option {
 	return func(f *Formatter) {
@@ -70,14 +73,15 @@ func New(options ...Option) *Formatter {
 
 // Formatter that generates HTML.
 type Formatter struct {
-	standalone         bool
-	prefix             string
-	Classes            bool // Exported field to detect when classes are being used
-	tabWidth           int
-	lineNumbers        bool
-	lineNumbersInTable bool
-	highlightRanges    highlightRanges
-	baseLineNumber     int
+	standalone            bool
+	prefix                string
+	Classes               bool // Exported field to detect when classes are being used
+	preventSurroundingPre bool
+	tabWidth              int
+	lineNumbers           bool
+	lineNumbersInTable    bool
+	highlightRanges       highlightRanges
+	baseLineNumber        int
 }
 
 type highlightRanges [][2]int
@@ -125,7 +129,7 @@ func (f *Formatter) restyle(style *chroma.Style) (*chroma.Style, error) {
 // We deliberately don't use html/template here because it is two orders of magnitude slower (benchmarked).
 //
 // OTOH we need to be super careful about correct escaping...
-func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []*chroma.Token) (err error) { // nolint: gocyclo
+func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.Token) (err error) { // nolint: gocyclo
 	style, err = f.restyle(style)
 	if err != nil {
 		return err
@@ -149,7 +153,7 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []*chroma
 
 	wrapInTable := f.lineNumbers && f.lineNumbersInTable
 
-	lines := splitTokensIntoLines(tokens)
+	lines := chroma.SplitTokensIntoLines(tokens)
 	lineDigits := len(fmt.Sprintf("%d", len(lines)))
 	highlightIndex := 0
@@ -158,7 +162,9 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []*chroma
 		fmt.Fprintf(w, "<div%s>\n", f.styleAttr(css, chroma.Background))
 		fmt.Fprintf(w, "<table%s><tr>", f.styleAttr(css, chroma.LineTable))
 		fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD))
-		fmt.Fprintf(w, "<pre%s>", f.styleAttr(css, chroma.Background))
+		if !f.preventSurroundingPre {
+			fmt.Fprintf(w, "<pre%s>", f.styleAttr(css, chroma.Background))
+		}
 		for index := range lines {
 			line := f.baseLineNumber + index
 			highlight, next := f.shouldHighlight(highlightIndex, line)
@@ -175,11 +181,16 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []*chroma
 				fmt.Fprintf(w, "</span>")
 			}
 		}
-		fmt.Fprint(w, "</pre></td>\n")
+		if !f.preventSurroundingPre {
+			fmt.Fprint(w, "</pre>")
+		}
+		fmt.Fprint(w, "</td>\n")
 		fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD))
 	}
 
-	fmt.Fprintf(w, "<pre%s>", f.styleAttr(css, chroma.Background))
+	if !f.preventSurroundingPre {
+		fmt.Fprintf(w, "<pre%s>", f.styleAttr(css, chroma.Background))
+	}
 	highlightIndex = 0
 	for index, tokens := range lines {
 		// 1-based line number.
@@ -209,7 +220,9 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []*chroma
 		}
 	}
 
-	fmt.Fprint(w, "</pre>")
+	if !f.preventSurroundingPre {
+		fmt.Fprint(w, "</pre>")
+	}
 
 	if wrapInTable {
 		fmt.Fprint(w, "</td></tr></table>\n")
@@ -354,6 +367,9 @@ func StyleEntryToCSS(e chroma.StyleEntry) string {
 	if e.Italic == chroma.Yes {
 		styles = append(styles, "font-style: italic")
 	}
+	if e.Underline == chroma.Yes {
+		styles = append(styles, "text-decoration: underline")
+	}
 	return strings.Join(styles, "; ")
 }
@@ -374,26 +390,3 @@ func compressStyle(s string) string {
 	}
 	return strings.Join(out, ";")
 }
-
-func splitTokensIntoLines(tokens []*chroma.Token) (out [][]*chroma.Token) {
-	line := []*chroma.Token{}
-	for _, token := range tokens {
-		for strings.Contains(token.Value, "\n") {
-			parts := strings.SplitAfterN(token.Value, "\n", 2)
-			// Token becomes the tail.
-			token.Value = parts[1]
-
-			// Append the head to the line and flush the line.
-			clone := token.Clone()
-			clone.Value = parts[0]
-			line = append(line, clone)
-			out = append(out, line)
-			line = nil
-		}
-		line = append(line, token)
-	}
-	if len(line) > 0 {
-		out = append(out, line)
-	}
-	return
-}
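The new `preventSurroundingPre` flag makes every `<pre>` write conditional, so callers can embed the generated spans in their own markup. Stripped of the formatter machinery, the control flow reduces to this standalone sketch (not chroma's actual formatter):

```go
package main

import (
	"fmt"
	"strings"
)

// render wraps body in <pre> tags unless preventSurroundingPre is set,
// mirroring the `if !f.preventSurroundingPre` guards added in writeHTML.
func render(body string, preventSurroundingPre bool) string {
	var sb strings.Builder
	if !preventSurroundingPre {
		sb.WriteString("<pre>")
	}
	sb.WriteString(body)
	if !preventSurroundingPre {
		sb.WriteString("</pre>")
	}
	return sb.String()
}

func main() {
	fmt.Println(render("<span>x</span>", false)) // <pre><span>x</span></pre>
	fmt.Println(render("<span>x</span>", true))  // <span>x</span>
}
```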
vendor/github.com/alecthomas/chroma/formatters/json.go (generated, vendored; 2 lines changed)

@@ -12,7 +12,7 @@ import (
 var JSON = Register("json", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
 	fmt.Fprintln(w, "[")
 	i := 0
-	for t := it(); t != nil; t = it() {
+	for t := it(); t != chroma.EOF; t = it() {
 		if i > 0 {
 			fmt.Fprintln(w, ",")
 		}
vendor/github.com/alecthomas/chroma/formatters/tokens.go (generated, vendored; 2 lines changed)

@@ -9,7 +9,7 @@ import (
 
 // Tokens formatter outputs the raw token structures.
 var Tokens = Register("tokens", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
-	for t := it(); t != nil; t = it() {
+	for t := it(); t != chroma.EOF; t = it() {
 		if _, err := fmt.Fprintln(w, t.GoString()); err != nil {
 			return err
 		}
vendor/github.com/alecthomas/chroma/formatters/tty_indexed.go (generated, vendored; 2 lines changed)

@@ -216,7 +216,7 @@ func (c *indexedTTYFormatter) Format(w io.Writer, style *chroma.Style, it chroma
 	}()
 	theme := styleToEscapeSequence(c.table, style)
-	for token := it(); token != nil; token = it() {
+	for token := it(); token != chroma.EOF; token = it() {
 		// TODO: Cache token lookups?
 		clr, ok := theme[token.Type]
 		if !ok {
vendor/github.com/alecthomas/chroma/formatters/tty_truecolour.go (generated, vendored; 2 lines changed)

@@ -11,7 +11,7 @@ import (
 var TTY16m = Register("terminal16m", chroma.FormatterFunc(trueColourFormatter))
 
 func trueColourFormatter(w io.Writer, style *chroma.Style, it chroma.Iterator) error {
-	for token := it(); token != nil; token = it() {
+	for token := it(); token != chroma.EOF; token = it() {
 		entry := style.Get(token.Type)
 		if !entry.IsZero() {
 			out := ""
vendor/github.com/alecthomas/chroma/go.mod (new file, generated, vendored; 14 lines)

@@ -0,0 +1,14 @@
module github.com/alecthomas/chroma

require (
	github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38
	github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 // indirect
	github.com/alecthomas/kong v0.1.15
	github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 // indirect
	github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964
	github.com/dlclark/regexp2 v1.1.6
	github.com/mattn/go-colorable v0.0.9
	github.com/mattn/go-isatty v0.0.4
	github.com/sergi/go-diff v1.0.0 // indirect
	golang.org/x/sys v0.0.0-20181128092732-4ed8d59d0b35 // indirect
)
vendor/github.com/alecthomas/chroma/go.sum (new file, generated, vendored; 26 lines)

@@ -0,0 +1,26 @@
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38 h1:smF2tmSOzy2Mm+0dGI2AIUHY+w0BUc+4tn40djz7+6U=
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38/go.mod h1:r7bzyVFMNntcxPZXK3/+KdruV1H5KSlyVY0gc+NgInI=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 h1:JHZL0hZKJ1VENNfmXvHbgYlbUOvpzYzvy2aZU5gXVeo=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721/go.mod h1:QO9JBoKquHd+jz9nshCh40fOfO+JzsoXy8qTHF68zU0=
github.com/alecthomas/kong v0.1.15 h1:IWBg+KrLvoHBicD50OzMI8fKjrtAa1okMR9g38HVM/s=
github.com/alecthomas/kong v0.1.15/go.mod h1:0m2VYms8rH0qbCqVB2gvGHk74bqLIq0HXjCs5bNbNQU=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 h1:p9Sln00KOTlrYkxI1zYWl1QLnEqAqEARBEYa8FQnQcY=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897/go.mod h1:xTS7Pm1pD1mvyM075QCDSRqH6qRLXylzS24ZTpRiSzQ=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964 h1:y5HC9v93H5EPKqaS1UYVg1uYah5Xf51mBfIoWehClUQ=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964/go.mod h1:Xd9hchkHSWYkEqJwUGisez3G1QY8Ryz0sdWrLPMGjLk=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dlclark/regexp2 v1.1.6 h1:CqB4MjHw0MFCDj+PHHjiESmHX+N7t0tJzKvC6M97BRg=
github.com/dlclark/regexp2 v1.1.6/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
github.com/mattn/go-colorable v0.0.9 h1:UVL0vNpWh04HeJXV0KLcaT7r06gOH2l4OW6ddYRUIY4=
github.com/mattn/go-colorable v0.0.9/go.mod h1:9vuHe8Xs5qXnSaW/c/ABM9alt+Vo+STaOChaDxuIBZU=
github.com/mattn/go-isatty v0.0.4 h1:bnP0vzxcAdeI1zdubAl5PjU6zsERjGZb7raWodagDYs=
github.com/mattn/go-isatty v0.0.4/go.mod h1:M+lRXTBqGeGNdLjl/ufCoiOlB5xdOkqRJdNxMWT7Zi4=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/sergi/go-diff v1.0.0 h1:Kpca3qRNrduNnOQeazBd0ysaKrUJiIuISHxogkT9RPQ=
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
github.com/stretchr/testify v1.2.2 h1:bSDNvY7ZPG5RlJ8otE/7V6gMiyenm9RtJ7IUVIAoJ1w=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
golang.org/x/sys v0.0.0-20181128092732-4ed8d59d0b35 h1:YAFjXN64LMvktoUZH9zgY4lGc/msGN7HQfoSuKCgaDU=
golang.org/x/sys v0.0.0-20181128092732-4ed8d59d0b35/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
vendor/github.com/alecthomas/chroma/iterator.go (generated, vendored; 52 lines changed)

@@ -1,16 +1,18 @@
 package chroma
 
+import "strings"
+
 // An Iterator across tokens.
 //
 // nil will be returned at the end of the Token stream.
 //
 // If an error occurs within an Iterator, it may propagate this in a panic. Formatters should recover.
-type Iterator func() *Token
+type Iterator func() Token
 
 // Tokens consumes all tokens from the iterator and returns them as a slice.
-func (i Iterator) Tokens() []*Token {
-	out := []*Token{}
-	for t := i(); t != nil; t = i() {
+func (i Iterator) Tokens() []Token {
+	var out []Token
+	for t := i(); t != EOF; t = i() {
 		out = append(out, t)
 	}
 	return out
@@ -18,26 +20,56 @@ func (i Iterator) Tokens() []*Token {
 
 // Concaterator concatenates tokens from a series of iterators.
 func Concaterator(iterators ...Iterator) Iterator {
-	return func() *Token {
+	return func() Token {
 		for len(iterators) > 0 {
 			t := iterators[0]()
-			if t != nil {
+			if t != EOF {
 				return t
 			}
 			iterators = iterators[1:]
 		}
-		return nil
+		return EOF
 	}
 }
 
 // Literator converts a sequence of literal Tokens into an Iterator.
-func Literator(tokens ...*Token) Iterator {
-	return func() (out *Token) {
+func Literator(tokens ...Token) Iterator {
+	return func() Token {
 		if len(tokens) == 0 {
-			return nil
+			return EOF
 		}
 		token := tokens[0]
 		tokens = tokens[1:]
 		return token
 	}
 }
+
+func SplitTokensIntoLines(tokens []Token) (out [][]Token) {
+	var line []Token
+	for _, token := range tokens {
+		for strings.Contains(token.Value, "\n") {
+			parts := strings.SplitAfterN(token.Value, "\n", 2)
+			// Token becomes the tail.
+			token.Value = parts[1]
+
+			// Append the head to the line and flush the line.
+			clone := token.Clone()
+			clone.Value = parts[0]
+			line = append(line, clone)
+			out = append(out, line)
+			line = nil
+		}
+		line = append(line, token)
+	}
+	if len(line) > 0 {
+		out = append(out, line)
+	}
+	// Strip empty trailing token line.
+	if len(out) > 0 {
+		last := out[len(out)-1]
+		if len(last) == 1 && last[0].Value == "" {
+			out = out[:len(out)-1]
+		}
+	}
+	return
+}
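`SplitTokensIntoLines`, moved here from the HTML formatter and exported, regroups tokens into one slice per source line so formatters can emit per-line markup. The splitting step can be shown standalone (a simplified `Token` type for illustration, not chroma's API):

```go
package main

import (
	"fmt"
	"strings"
)

// Simplified stand-in for chroma's value-type Token.
type Token struct {
	Type  string
	Value string
}

// splitIntoLines regroups tokens into one slice per line, cloning any
// token that spans a newline, as SplitTokensIntoLines does.
func splitIntoLines(tokens []Token) (out [][]Token) {
	var line []Token
	for _, token := range tokens {
		for strings.Contains(token.Value, "\n") {
			parts := strings.SplitAfterN(token.Value, "\n", 2)
			head := token // plain assignment clones a value-type token
			head.Value = parts[0]
			token.Value = parts[1] // the token becomes the tail
			line = append(line, head)
			out = append(out, line)
			line = nil
		}
		line = append(line, token)
	}
	if len(line) > 0 {
		out = append(out, line)
	}
	return out
}

func main() {
	lines := splitIntoLines([]Token{{"Text", "a\nb"}, {"Name", "c"}})
	fmt.Println(len(lines)) // 2: ["a\n"] and ["b", "c"]
}
```

The exported version additionally strips a trailing line holding a single empty token, which this sketch omits for brevity.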
vendor/github.com/alecthomas/chroma/lexer.go (generated, vendored; 10 lines changed)

@@ -64,14 +64,14 @@ type Token struct {
 }
 
 func (t *Token) String() string   { return t.Value }
-func (t *Token) GoString() string { return fmt.Sprintf("Token{%s, %q}", t.Type, t.Value) }
+func (t *Token) GoString() string { return fmt.Sprintf("&Token{%s, %q}", t.Type, t.Value) }
 
-func (t *Token) Clone() *Token {
-	clone := &Token{}
-	*clone = *t
-	return clone
+func (t *Token) Clone() Token {
+	return *t
 }
 
+var EOF Token
+
 type TokeniseOptions struct {
 	// State to start tokenisation in. Defaults to "root".
 	State string
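With `Token` now handled by value, iterators across this commit signal exhaustion with the `EOF` zero value rather than a nil pointer (hence all the `t != nil` loops becoming `t != chroma.EOF`). A minimal standalone version of the `Literator`/`Tokens` pair shows the pattern (simplified types, not chroma's API):

```go
package main

import "fmt"

// Simplified stand-in for chroma's value-type Token.
type Token struct {
	Type  string
	Value string
}

// EOF: the zero value is the end-of-stream sentinel, as in lexer.go.
var EOF Token

// Iterator mirrors chroma's new signature: a func returning Token by value.
type Iterator func() Token

// literator yields the given tokens one at a time, then EOF forever.
func literator(tokens ...Token) Iterator {
	return func() Token {
		if len(tokens) == 0 {
			return EOF
		}
		t := tokens[0]
		tokens = tokens[1:]
		return t
	}
}

// collect drains an Iterator; the same loop shape as Iterator.Tokens().
func collect(it Iterator) []Token {
	var out []Token
	for t := it(); t != EOF; t = it() {
		out = append(out, t)
	}
	return out
}

func main() {
	it := literator(Token{"Keyword", "func"}, Token{"Text", " "})
	fmt.Println(len(collect(it))) // 2
}
```

The trade-off of a sentinel value is that a token equal to the zero value would terminate iteration early; the empty-token skipping added to the coalescer keeps such tokens out of the stream.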
vendor/github.com/alecthomas/chroma/lexers/b/ballerina.go (new file, generated, vendored; 46 lines)

@@ -0,0 +1,46 @@
package b

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Ballerina lexer.
var Ballerina = internal.Register(MustNewLexer(
	&Config{
		Name:      "Ballerina",
		Aliases:   []string{"ballerina"},
		Filenames: []string{"*.bal"},
		MimeTypes: []string{"text/x-ballerina"},
		DotAll:    true,
	},
	Rules{
		"root": {
			{`[^\S\n]+`, Text, nil},
			{`//.*?\n`, CommentSingle, nil},
			{`/\*.*?\*/`, CommentMultiline, nil},
			{`(break|catch|continue|done|else|finally|foreach|forever|fork|if|lock|match|return|throw|transaction|try|while)\b`, Keyword, nil},
			{`((?:(?:[^\W\d]|\$)[\w.\[\]$<>]*\s+)+?)((?:[^\W\d]|\$)[\w$]*)(\s*)(\()`, ByGroups(UsingSelf("root"), NameFunction, Text, Operator), nil},
			{`@[^\W\d][\w.]*`, NameDecorator, nil},
			{`(annotation|bind|but|endpoint|error|function|object|private|public|returns|service|type|var|with|worker)\b`, KeywordDeclaration, nil},
			{`(boolean|byte|decimal|float|int|json|map|nil|record|string|table|xml)\b`, KeywordType, nil},
			{`(true|false|null)\b`, KeywordConstant, nil},
			{`import(\s+)`, ByGroups(KeywordNamespace, Text), Push("import")},
			{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
			{`'\\.'|'[^\\]'|'\\u[0-9a-fA-F]{4}'`, LiteralStringChar, nil},
			{`(\.)((?:[^\W\d]|\$)[\w$]*)`, ByGroups(Operator, NameAttribute), nil},
			{`^\s*([^\W\d]|\$)[\w$]*:`, NameLabel, nil},
			{`([^\W\d]|\$)[\w$]*`, Name, nil},
			{`([0-9][0-9_]*\.([0-9][0-9_]*)?|\.[0-9][0-9_]*)([eE][+\-]?[0-9][0-9_]*)?[fFdD]?|[0-9][eE][+\-]?[0-9][0-9_]*[fFdD]?|[0-9]([eE][+\-]?[0-9][0-9_]*)?[fFdD]|0[xX]([0-9a-fA-F][0-9a-fA-F_]*\.?|([0-9a-fA-F][0-9a-fA-F_]*)?\.[0-9a-fA-F][0-9a-fA-F_]*)[pP][+\-]?[0-9][0-9_]*[fFdD]?`, LiteralNumberFloat, nil},
			{`0[xX][0-9a-fA-F][0-9a-fA-F_]*[lL]?`, LiteralNumberHex, nil},
			{`0[bB][01][01_]*[lL]?`, LiteralNumberBin, nil},
			{`0[0-7_]+[lL]?`, LiteralNumberOct, nil},
			{`0|[1-9][0-9_]*[lL]?`, LiteralNumberInteger, nil},
			{`[~^*!%&\[\](){}<>|+=:;,./?-]`, Operator, nil},
			{`\n`, Text, nil},
		},
		"import": {
			{`[\w.]+`, NameNamespace, Pop(1)},
		},
	},
))
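New lexers like the Ballerina one above are essentially ordered regex rule tables: each rule pairs a pattern with a token type, and the first rule matching at the current position wins. A toy standalone version of that dispatch loop (a hypothetical mini-lexer for illustration, not chroma's engine, which also supports states, groups, and pushes):

```go
package main

import (
	"fmt"
	"regexp"
)

type rule struct {
	pattern *regexp.Regexp
	typ     string
}

// rules are tried in order at the current offset; the first match wins,
// the same shape as a chroma Rules "root" state.
var rules = []rule{
	{regexp.MustCompile(`^\s+`), "Text"},
	{regexp.MustCompile(`^//[^\n]*`), "CommentSingle"},
	{regexp.MustCompile(`^(if|else|while|return)\b`), "Keyword"},
	{regexp.MustCompile(`^[0-9]+`), "Number"},
	{regexp.MustCompile(`^[A-Za-z_]\w*`), "Name"},
	{regexp.MustCompile(`^[^\sA-Za-z_0-9]`), "Punctuation"}, // fallback: one symbol rune
}

// lex scans src left to right, emitting (type, value) pairs.
func lex(src string) (out [][2]string) {
	for len(src) > 0 {
		for _, r := range rules {
			if m := r.pattern.FindString(src); m != "" {
				out = append(out, [2]string{r.typ, m})
				src = src[len(m):]
				break
			}
		}
	}
	return out
}

func main() {
	for _, t := range lex("if x1 { return 42 }") {
		fmt.Println(t[0], t[1])
	}
}
```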
vendor/github.com/alecthomas/chroma/lexers/b/bash.go (generated, vendored; 2 lines changed)

@@ -36,7 +36,7 @@ var Bash = internal.Register(MustNewLexer(
 		{`\b(if|fi|else|while|do|done|for|then|return|function|case|select|continue|until|esac|elif)(\s*)\b`, ByGroups(Keyword, Text), nil},
 		{"\\b(alias|bg|bind|break|builtin|caller|cd|command|compgen|complete|declare|dirs|disown|echo|enable|eval|exec|exit|export|false|fc|fg|getopts|hash|help|history|jobs|kill|let|local|logout|popd|printf|pushd|pwd|read|readonly|set|shift|shopt|source|suspend|test|time|times|trap|true|type|typeset|ulimit|umask|unalias|unset|wait)(?=[\\s)`])", NameBuiltin, nil},
 		{`\A#!.+\n`, CommentPreproc, nil},
-		{`#.*\n`, CommentSingle, nil},
+		{`#.*\S`, CommentSingle, nil},
 		{`\\[\w\W]`, LiteralStringEscape, nil},
 		{`(\b\w+)(\s*)(\+?=)`, ByGroups(NameVariable, Text, Operator), nil},
 		{`[\[\]{}()=]`, Operator, nil},
vendor/github.com/alecthomas/chroma/lexers/c/cpp.go (generated, vendored; 2 lines changed)

@@ -18,7 +18,7 @@ var CPP = internal.Register(MustNewLexer(
 	"statements": {
 		{Words(``, `\b`, `catch`, `const_cast`, `delete`, `dynamic_cast`, `explicit`, `export`, `friend`, `mutable`, `namespace`, `new`, `operator`, `private`, `protected`, `public`, `reinterpret_cast`, `restrict`, `static_cast`, `template`, `this`, `throw`, `throws`, `try`, `typeid`, `typename`, `using`, `virtual`, `constexpr`, `nullptr`, `decltype`, `thread_local`, `alignas`, `alignof`, `static_assert`, `noexcept`, `override`, `final`), Keyword, nil},
 		{`char(16_t|32_t)\b`, KeywordType, nil},
-		{`(class)(\s+)`, ByGroups(Keyword, Text), Push("classname")},
+		{`(class)\b`, ByGroups(Keyword, Text), Push("classname")},
 		{`(R)(")([^\\()\s]{,16})(\()((?:.|\n)*?)(\)\3)(")`, ByGroups(LiteralStringAffix, LiteralString, LiteralStringDelimiter, LiteralStringDelimiter, LiteralString, LiteralStringDelimiter, LiteralString), nil},
 		{`(u8|u|U)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
 		{`(L?)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
69 vendor/github.com/alecthomas/chroma/lexers/c/cql.go generated vendored Normal file
@@ -0,0 +1,69 @@
package c

import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)

// CassandraCQL lexer.
var CassandraCQL = internal.Register(MustNewLexer(
&Config{
Name: "Cassandra CQL",
Aliases: []string{"cassandra", "cql"},
Filenames: []string{"*.cql"},
MimeTypes: []string{"text/x-cql"},
NotMultiline: true,
CaseInsensitive: true,
},
Rules{
"root": {
{`\s+`, TextWhitespace, nil},
{`(--|\/\/).*\n?`, CommentSingle, nil},
{`/\*`, CommentMultiline, Push("multiline-comments")},
{`(ascii|bigint|blob|boolean|counter|date|decimal|double|float|frozen|inet|int|list|map|set|smallint|text|time|timestamp|timeuuid|tinyint|tuple|uuid|varchar|varint)\b`, NameBuiltin, nil},
{Words(``, `\b`, `ADD`, `AGGREGATE`, `ALL`, `ALLOW`, `ALTER`, `AND`, `ANY`, `APPLY`, `AS`, `ASC`, `AUTHORIZE`, `BATCH`, `BEGIN`, `BY`, `CLUSTERING`, `COLUMNFAMILY`, `COMPACT`, `CONSISTENCY`, `COUNT`, `CREATE`, `CUSTOM`, `DELETE`, `DESC`, `DISTINCT`, `DROP`, `EACH_QUORUM`, `ENTRIES`, `EXISTS`, `FILTERING`, `FROM`, `FULL`, `GRANT`, `IF`, `IN`, `INDEX`, `INFINITY`, `INSERT`, `INTO`, `KEY`, `KEYS`, `KEYSPACE`, `KEYSPACES`, `LEVEL`, `LIMIT`, `LOCAL_ONE`, `LOCAL_QUORUM`, `MATERIALIZED`, `MODIFY`, `NAN`, `NORECURSIVE`, `NOSUPERUSER`, `NOT`, `OF`, `ON`, `ONE`, `ORDER`, `PARTITION`, `PASSWORD`, `PER`, `PERMISSION`, `PERMISSIONS`, `PRIMARY`, `QUORUM`, `RENAME`, `REVOKE`, `SCHEMA`, `SELECT`, `STATIC`, `STORAGE`, `SUPERUSER`, `TABLE`, `THREE`, `TO`, `TOKEN`, `TRUNCATE`, `TTL`, `TWO`, `TYPE`, `UNLOGGED`, `UPDATE`, `USE`, `USER`, `USERS`, `USING`, `VALUES`, `VIEW`, `WHERE`, `WITH`, `WRITETIME`, `REPLICATION`, `OR`, `REPLACE`, `FUNCTION`, `CALLED`, `INPUT`, `RETURNS`, `LANGUAGE`, `ROLE`, `ROLES`, `TRIGGER`, `DURABLE_WRITES`, `LOGIN`, `OPTIONS`, `LOGGED`, `SFUNC`, `STYPE`, `FINALFUNC`, `INITCOND`, `IS`, `CONTAINS`, `JSON`, `PAGING`, `OFF`), Keyword, nil},
{"[+*/<>=~!@#%^&|`?-]+", Operator, nil},
{`(?s)(java|javascript)(\s+)(AS)(\s+)('|\$\$)(.*?)(\5)`,
UsingByGroup(
internal.Get,
1, 6,
NameBuiltin, TextWhitespace, Keyword, TextWhitespace,
LiteralStringHeredoc, LiteralStringHeredoc, LiteralStringHeredoc,
),
nil,
},
{`(true|false|null)\b`, KeywordConstant, nil},
{`0x[0-9a-f]+`, LiteralNumberHex, nil},
{`[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}`, LiteralNumberHex, nil},
{`\.[0-9]+(e[+-]?[0-9]+)?`, Error, nil},
{`-?[0-9]+(\.[0-9])?(e[+-]?[0-9]+)?`, LiteralNumberFloat, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`'`, LiteralStringSingle, Push("string")},
{`"`, LiteralStringName, Push("quoted-ident")},
{`\$\$`, LiteralStringHeredoc, Push("dollar-string")},
{`[a-z_]\w*`, Name, nil},
{`:(['"]?)[a-z]\w*\b\1`, NameVariable, nil},
{`[;:()\[\]\{\},.]`, Punctuation, nil},
},
"multiline-comments": {
{`/\*`, CommentMultiline, Push("multiline-comments")},
{`\*/`, CommentMultiline, Pop(1)},
{`[^/*]+`, CommentMultiline, nil},
{`[/*]`, CommentMultiline, nil},
},
"string": {
{`[^']+`, LiteralStringSingle, nil},
{`''`, LiteralStringSingle, nil},
{`'`, LiteralStringSingle, Pop(1)},
},
"quoted-ident": {
{`[^"]+`, LiteralStringName, nil},
{`""`, LiteralStringName, nil},
{`"`, LiteralStringName, Pop(1)},
},
"dollar-string": {
{`[^\$]+`, LiteralStringHeredoc, nil},
{`\$\$`, LiteralStringHeredoc, Pop(1)},
},
},
))
1 vendor/github.com/alecthomas/chroma/lexers/c/csharp.go generated vendored
@@ -35,7 +35,6 @@ var CSharp = internal.Register(MustNewLexer(
{`(global)(::)`, ByGroups(Keyword, Punctuation), nil},
{`(bool|byte|char|decimal|double|dynamic|float|int|long|object|sbyte|short|string|uint|ulong|ushort|var)\b\??`, KeywordType, nil},
{`(class|struct)(\s+)`, ByGroups(Keyword, Text), Push("class")},
{`\b([_a-zA-Z]\w*)(\.)`, ByGroups(NameClass, Punctuation), nil},
{`(namespace|using)(\s+)`, ByGroups(Keyword, Text), Push("namespace")},
{`@?[_a-zA-Z]\w*`, Name, nil},
},
2 vendor/github.com/alecthomas/chroma/lexers/circular/doc.go generated vendored Normal file
@@ -0,0 +1,2 @@
// Package circular exists to break circular dependencies between lexers.
package circular
@@ -1,12 +1,15 @@
package p
package circular

import (
"strings"

. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/h"
"github.com/alecthomas/chroma/lexers/internal"
)

// PHP lexer.
var PHP = internal.Register(MustNewLexer(
var PHP = internal.Register(DelegatingLexer(h.HTML, MustNewLexer(
&Config{
Name: "PHP",
Aliases: []string{"php", "php3", "php4", "php5"},
@@ -80,4 +83,9 @@ var PHP = internal.Register(MustNewLexer(
{`[${\\]`, LiteralStringDouble, nil},
},
},
))
).SetAnalyser(func(text string) float32 {
if strings.Contains(text, "<?php") {
return 0.5
}
return 0.0
})))
6 vendor/github.com/alecthomas/chroma/lexers/e/elixir.go generated vendored
@@ -36,9 +36,9 @@ var Elixir = internal.Register(MustNewLexer(
{`\\\\|\<\<|\>\>|\=\>|\(|\)|\:|\;|\,|\[|\]`, Punctuation, nil},
{`&\d`, NameEntity, nil},
{`\<|\>|\+|\-|\*|\/|\!|\^|\&`, Operator, nil},
{`0b[01]+`, LiteralNumberBin, nil},
{`0o[0-7]+`, LiteralNumberOct, nil},
{`0x[\da-fA-F]+`, LiteralNumberHex, nil},
{`0b[01](_?[01])*`, LiteralNumberBin, nil},
{`0o[0-7](_?[0-7])*`, LiteralNumberOct, nil},
{`0x[\da-fA-F](_?[\dA-Fa-f])*`, LiteralNumberHex, nil},
{`\d(_?\d)*\.\d(_?\d)*([eE][-+]?\d(_?\d)*)?`, LiteralNumberFloat, nil},
{`\d(_?\d)*`, LiteralNumberInteger, nil},
{`"""\s*`, LiteralStringHeredoc, Push("heredoc_double")},
60 vendor/github.com/alecthomas/chroma/lexers/g/go.go generated vendored
@@ -4,6 +4,7 @@ import (
"strings"

. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/h"
"github.com/alecthomas/chroma/lexers/internal"
)

@@ -38,9 +39,10 @@ var Go = internal.Register(MustNewLexer(
{`0[xX][0-9a-fA-F]+`, LiteralNumberHex, nil},
{`(0|[1-9][0-9]*)`, LiteralNumberInteger, nil},
{`'(\\['"\\abfnrtv]|\\x[0-9a-fA-F]{2}|\\[0-7]{1,3}|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8}|[^\\])'`, LiteralStringChar, nil},
{"`[^`]*`", LiteralString, nil},
{"(`)([^`]*)(`)", ByGroups(LiteralString, Using(TypeRemappingLexer(GoTextTemplate, TypeMapping{{Other, LiteralString, nil}})), LiteralString), nil},
{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
{`(<<=|>>=|<<|>>|<=|>=|&\^=|&\^|\+=|-=|\*=|/=|%=|&=|\|=|&&|\|\||<-|\+\+|--|==|!=|:=|\.\.\.|[+\-*/%&])`, Operator, nil},
{`([a-zA-Z_]\w*)(\s*)(\()`, ByGroups(NameFunction, UsingSelf("root"), Punctuation), nil},
{`[|^<>=!()\[\]{}.,;:]`, Punctuation, nil},
{`[^\W\d]\w*`, NameOther, nil},
},
@@ -54,3 +56,59 @@ var Go = internal.Register(MustNewLexer(
}
return 0.0
}))

var goTemplateRules = Rules{
"root": {
{`{{[-]?`, CommentPreproc, Push("template")},
{`[^{]+`, Other, nil},
{`{`, Other, nil},
},
"template": {
{`[-]?}}`, CommentPreproc, Pop(1)},
{`/\*.*?\*/`, Comment, nil},
{`(?=}})`, CommentPreproc, Pop(1)}, // Terminate the pipeline
{`\(`, Operator, Push("subexpression")},
{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
Include("expression"),
},
"subexpression": {
{`\)`, Operator, Pop(1)},
Include("expression"),
},
"expression": {
{`\s+`, Whitespace, nil},
{`\(`, Operator, Push("subexpression")},
{`(range|if|else|while|with|template|end|true|false|nil|and|call|html|index|js|len|not|or|print|printf|println|urlquery|eq|ne|lt|le|gt|ge)\b`, Keyword, nil},
{`\||:=`, Operator, nil},
{`[$]?[^\W\d]\w*`, NameOther, nil},
{`[$]?\.(?:[^\W\d]\w*)?`, NameAttribute, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
{`\d+i`, LiteralNumber, nil},
{`\d+\.\d*([Ee][-+]\d+)?i`, LiteralNumber, nil},
{`\.\d+([Ee][-+]\d+)?i`, LiteralNumber, nil},
{`\d+[Ee][-+]\d+i`, LiteralNumber, nil},
{`\d+(\.\d+[eE][+\-]?\d+|\.\d*|[eE][+\-]?\d+)`, LiteralNumberFloat, nil},
{`\.\d+([eE][+\-]?\d+)?`, LiteralNumberFloat, nil},
{`0[0-7]+`, LiteralNumberOct, nil},
{`0[xX][0-9a-fA-F]+`, LiteralNumberHex, nil},
{`(0|[1-9][0-9]*)`, LiteralNumberInteger, nil},
{`'(\\['"\\abfnrtv]|\\x[0-9a-fA-F]{2}|\\[0-7]{1,3}|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8}|[^\\])'`, LiteralStringChar, nil},
{"`[^`]*`", LiteralString, nil},
},
}

var GoHTMLTemplate = internal.Register(DelegatingLexer(h.HTML, MustNewLexer(
&Config{
Name: "Go HTML Template",
Aliases: []string{"go-html-template"},
},
goTemplateRules,
)))

var GoTextTemplate = internal.Register(MustNewLexer(
&Config{
Name: "Go Text Template",
Aliases: []string{"go-text-template"},
},
goTemplateRules,
))
17 vendor/github.com/alecthomas/chroma/lexers/h/http.go generated vendored
@@ -34,7 +34,7 @@ var HTTP = internal.Register(httpBodyContentTypeLexer(MustNewLexer(
)))

func httpContentBlock(groups []string, lexer Lexer) Iterator {
tokens := []*Token{
tokens := []Token{
{Generic, groups[0]},
}
return Literator(tokens...)
@@ -42,7 +42,7 @@ func httpContentBlock(groups []string, lexer Lexer) Iterator {
}

func httpHeaderBlock(groups []string, lexer Lexer) Iterator {
tokens := []*Token{
tokens := []Token{
{Name, groups[1]},
{Text, groups[2]},
{Operator, groups[3]},
@@ -54,7 +54,7 @@ func httpHeaderBlock(groups []string, lexer Lexer) Iterator {
}

func httpContinuousHeaderBlock(groups []string, lexer Lexer) Iterator {
tokens := []*Token{
tokens := []Token{
{Text, groups[1]},
{Literal, groups[2]},
{Text, groups[3]},
@@ -76,8 +76,8 @@ func (d *httpBodyContentTyper) Tokenise(options *TokeniseOptions, text string) (
return nil, err
}

return func() *Token {
for token := it(); token != nil; token = it() {
return func() Token {
for token := it(); token != EOF; token = it() {
switch {
case token.Type == Name && strings.ToLower(token.Value) == "content-type":
{
@@ -85,6 +85,7 @@ func (d *httpBodyContentTyper) Tokenise(options *TokeniseOptions, text string) (
}
case token.Type == Literal && isContentType:
{
isContentType = false
contentType = strings.TrimSpace(token.Value)
pos := strings.Index(contentType, ";")
if pos > 0 {
@@ -111,7 +112,7 @@ func (d *httpBodyContentTyper) Tokenise(options *TokeniseOptions, text string) (
if err != nil {
panic(err)
}
return nil
return EOF
}
}

@@ -121,11 +122,11 @@ func (d *httpBodyContentTyper) Tokenise(options *TokeniseOptions, text string) (
}

if subIterator != nil {
for token := subIterator(); token != nil; token = subIterator() {
for token := subIterator(); token != EOF; token = subIterator() {
return token
}
}
return nil
return EOF

}, nil
}
2 vendor/github.com/alecthomas/chroma/lexers/i/ini.go generated vendored
@@ -10,7 +10,7 @@ var Ini = internal.Register(MustNewLexer(
&Config{
Name: "INI",
Aliases: []string{"ini", "cfg", "dosini"},
Filenames: []string{"*.ini", "*.cfg", "*.inf"},
Filenames: []string{"*.ini", "*.cfg", "*.inf", ".gitconfig"},
MimeTypes: []string{"text/x-ini", "text/inf"},
},
Rules{
15 vendor/github.com/alecthomas/chroma/lexers/internal/api.go generated vendored
@@ -144,13 +144,16 @@ func Register(lexer chroma.Lexer) chroma.Lexer {
return lexer
}

// Fallback lexer if no other is found.
var Fallback chroma.Lexer = chroma.MustNewLexer(&chroma.Config{
Name: "fallback",
Filenames: []string{"*"},
}, chroma.Rules{
// Used for the fallback lexer as well as the explicit plaintext lexer
var PlaintextRules = chroma.Rules{
"root": []chroma.Rule{
{`.+`, chroma.Text, nil},
{`\n`, chroma.Text, nil},
},
})
}

// Fallback lexer if no other is found.
var Fallback chroma.Lexer = chroma.MustNewLexer(&chroma.Config{
Name: "fallback",
Filenames: []string{"*"},
}, PlaintextRules)
4 vendor/github.com/alecthomas/chroma/lexers/j/json.go generated vendored
@@ -5,8 +5,8 @@ import (
"github.com/alecthomas/chroma/lexers/internal"
)

// Json lexer.
var Json = internal.Register(MustNewLexer(
// JSON lexer.
var JSON = internal.Register(MustNewLexer(
&Config{
Name: "JSON",
Aliases: []string{"json"},
50 vendor/github.com/alecthomas/chroma/lexers/j/jungle.go generated vendored Normal file
@@ -0,0 +1,50 @@
package j

import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)

var Jungle = internal.Register(MustNewLexer(
&Config{
Name: "Jungle",
Aliases: []string{"jungle"},
Filenames: []string{"*.jungle"},
MimeTypes: []string{"text/x-jungle"},
},
Rules{
"root": {
{`[^\S\n]+`, Text, nil},
{`\n`, Text, nil},
{`#(\n|[\w\W]*?[^#]\n)`, CommentSingle, nil},
{`^(?=\S)`, None, Push("instruction")},
{`[\.;\[\]\(\)\$]`, Punctuation, nil},
{`[a-zA-Z_]\w*`, Name, nil},
},
"instruction": {
{`[^\S\n]+`, Text, nil},
{`=`, Operator, Push("value")},
{`(?=\S)`, None, Push("var")},
Default(Pop(1)),
},
"value": {
{`[^\S\n]+`, Text, nil},
{`\$\(`, Punctuation, Push("var")},
{`[;\[\]\(\)\$]`, Punctuation, nil},
{`#(\n|[\w\W]*?[^#]\n)`, CommentSingle, nil},
{`[\w_\-\.\/\\]+`, Text, nil},
Default(Pop(1)),
},
"var": {
{`[^\S\n]+`, Text, nil},
{`\b(((re)?source|barrel)Path|excludeAnnotations|annotations|lang)\b`, NameBuiltin, nil},
{`\bbase\b`, NameConstant, nil},
{`\b(ind|zsm|hrv|ces|dan|dut|eng|fin|fre|deu|gre|hun|ita|nob|po[lr]|rus|sl[ov]|spa|swe|ara|heb|zh[st]|jpn|kor|tha|vie|bul|tur)`, NameConstant, nil},
{`\b((semi)?round|rectangle)(-\d+x\d+)?\b`, NameConstant, nil},
{`[\.;\[\]\(\$]`, Punctuation, nil},
{`\)`, Punctuation, Pop(1)},
{`[a-zA-Z_]\w*`, Name, nil},
Default(Pop(1)),
},
},
))
1 vendor/github.com/alecthomas/chroma/lexers/lexers.go generated vendored
@@ -9,6 +9,7 @@ import (
_ "github.com/alecthomas/chroma/lexers/a"
_ "github.com/alecthomas/chroma/lexers/b"
_ "github.com/alecthomas/chroma/lexers/c"
_ "github.com/alecthomas/chroma/lexers/circular"
_ "github.com/alecthomas/chroma/lexers/d"
_ "github.com/alecthomas/chroma/lexers/e"
_ "github.com/alecthomas/chroma/lexers/f"
34 vendor/github.com/alecthomas/chroma/lexers/m/markdown.go generated vendored
@@ -21,8 +21,15 @@ var Markdown = internal.Register(MustNewLexer(
{`^(\s*)([*-])(\s)(.+\n)`, ByGroups(Text, Keyword, Text, UsingSelf("inline")), nil},
{`^(\s*)([0-9]+\.)( .+\n)`, ByGroups(Text, Keyword, UsingSelf("inline")), nil},
{`^(\s*>\s)(.+\n)`, ByGroups(Keyword, GenericEmph), nil},
{"^(```\\n)([\\w\\W]*?)(^```$)", ByGroups(LiteralString, Text, LiteralString), nil},
{"^(```)(\\w+)(\\n)([\\w\\W]*?)(^```$)", EmitterFunc(markdownCodeBlock), nil},
{"^(```\\n)([\\w\\W]*?)(^```$)", ByGroups(String, Text, String), nil},
{"^(```)(\\w+)(\\n)([\\w\\W]*?)(^```$)",
UsingByGroup(
internal.Get,
2, 4,
String, String, String, Text, String,
),
nil,
},
Include("inline"),
},
"inline": {
@@ -38,26 +45,3 @@ var Markdown = internal.Register(MustNewLexer(
},
},
))

func markdownCodeBlock(groups []string, lexer Lexer) Iterator {
iterators := []Iterator{}
tokens := []*Token{
{String, groups[1]},
{String, groups[2]},
{Text, groups[3]},
}
code := groups[4]
lexer = internal.Get(groups[2])
if lexer == nil {
tokens = append(tokens, &Token{String, code})
iterators = append(iterators, Literator(tokens...))
} else {
sub, err := lexer.Tokenise(nil, code)
if err != nil {
panic(err)
}
iterators = append(iterators, Literator(tokens...), sub)
}
iterators = append(iterators, Literator(&Token{String, groups[5]}))
return Concaterator(iterators...)
}
62 vendor/github.com/alecthomas/chroma/lexers/m/monkeyc.go generated vendored Normal file
@@ -0,0 +1,62 @@
package m

import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)

var MonkeyC = internal.Register(MustNewLexer(
&Config{
Name: "MonkeyC",
Aliases: []string{"monkeyc"},
Filenames: []string{"*.mc"},
MimeTypes: []string{"text/x-monkeyc"},
},
Rules{
"root": {
{`[^\S\n]+`, Text, nil},
{`\n`, Text, nil},
{`//(\n|[\w\W]*?[^\\]\n)`, CommentSingle, nil},
{`/(\\\n)?[*][\w\W]*?[*](\\\n)?/`, CommentMultiline, nil},
{`/(\\\n)?[*][\w\W]*`, CommentMultiline, nil},
{`:[a-zA-Z_][\w_\.]*`, StringSymbol, nil},
{`[{}\[\]\(\),;:\.]`, Punctuation, nil},
{`[&~\|\^!+\-*\/%=?]`, Operator, nil},
{`=>|[+-]=|&&|\|\||>>|<<|[<>]=?|[!=]=`, Operator, nil},
{`\b(and|or|instanceof|has|extends|new)`, OperatorWord, nil},
{Words(``, `\b`, `NaN`, `null`, `true`, `false`), KeywordConstant, nil},
{`(using)((?:\s|\\\\s)+)`, ByGroups(KeywordNamespace, Text), Push("import")},
{`(class)((?:\s|\\\\s)+)`, ByGroups(KeywordDeclaration, Text), Push("class")},
{`(function)((?:\s|\\\\s)+)`, ByGroups(KeywordDeclaration, Text), Push("function")},
{`(module)((?:\s|\\\\s)+)`, ByGroups(KeywordDeclaration, Text), Push("module")},
{`\b(if|else|for|switch|case|while|break|continue|default|do|try|catch|finally|return|throw|extends|function)\b`, Keyword, nil},
{`\b(const|enum|hidden|public|protected|private|static)\b`, KeywordType, nil},
{`\bvar\b`, KeywordDeclaration, nil},
{`\b(Activity(Monitor|Recording)?|Ant(Plus)?|Application|Attention|Background|Communications|Cryptography|FitContributor|Graphics|Gregorian|Lang|Math|Media|Persisted(Content|Locations)|Position|Properties|Sensor(History|Logging)?|Storage|StringUtil|System|Test|Time(r)?|Toybox|UserProfile|WatchUi|Rez|Drawables|Strings|Fonts|method)\b`, NameBuiltin, nil},
{`\b(me|self|\$)\b`, NameBuiltinPseudo, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`'(\\\\|\\'|[^''])*'`, LiteralStringSingle, nil},
{`-?(0x[0-9a-fA-F]+l?)`, NumberHex, nil},
{`-?([0-9]+(\.[0-9]+[df]?|[df]))\b`, NumberFloat, nil},
{`-?([0-9]+l?)`, NumberInteger, nil},
{`[a-zA-Z_]\w*`, Name, nil},
},
"import": {
{`([a-zA-Z_][\w_\.]*)(?:(\s+)(as)(\s+)([a-zA-Z_][\w_]*))?`, ByGroups(NameNamespace, Text, KeywordNamespace, Text, NameNamespace), nil},
Default(Pop(1)),
},
"class": {
{`([a-zA-Z_][\w_\.]*)(?:(\s+)(extends)(\s+)([a-zA-Z_][\w_\.]*))?`, ByGroups(NameClass, Text, KeywordDeclaration, Text, NameClass), nil},
Default(Pop(1)),
},
"function": {
{`initialize`, NameFunctionMagic, nil},
{`[a-zA-Z_][\w_\.]*`, NameFunction, nil},
Default(Pop(1)),
},
"module": {
{`[a-zA-Z_][\w_\.]*`, NameNamespace, nil},
Default(Pop(1)),
},
},
))
1 vendor/github.com/alecthomas/chroma/lexers/n/nim.go generated vendored
@@ -31,6 +31,7 @@ var Nim = internal.Register(MustNewLexer(
{`(v_?a_?r)\b`, KeywordDeclaration, nil},
{`(i_?n_?t_?|i_?n_?t_?8_?|i_?n_?t_?1_?6_?|i_?n_?t_?3_?2_?|i_?n_?t_?6_?4_?|f_?l_?o_?a_?t_?|f_?l_?o_?a_?t_?3_?2_?|f_?l_?o_?a_?t_?6_?4_?|b_?o_?o_?l_?|c_?h_?a_?r_?|r_?a_?n_?g_?e_?|a_?r_?r_?a_?y_?|s_?e_?q_?|s_?e_?t_?|s_?t_?r_?i_?n_?g_?)\b`, KeywordType, nil},
{`(n_?i_?l_?|t_?r_?u_?e_?|f_?a_?l_?s_?e_?)\b`, KeywordPseudo, nil},
{`\b_\b`, Name, nil}, // Standalone _ used as discardable variable identifier
{`\b((?![_\d])\w)(((?!_)\w)|(_(?!_)\w))*`, Name, nil},
{`[0-9][0-9_]*(?=([e.]|\'f(32|64)))`, LiteralNumberFloat, Push("float-suffix", "float-number")},
{`0x[a-f0-9][a-f0-9_]*`, LiteralNumberHex, Push("int-suffix")},
43 vendor/github.com/alecthomas/chroma/lexers/o/openscad.go generated vendored Normal file
@@ -0,0 +1,43 @@
package o

import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)

var OpenSCAD = internal.Register(MustNewLexer(
&Config{
Name: "OpenSCAD",
Aliases: []string{"openscad"},
Filenames: []string{"*.scad"},
MimeTypes: []string{"text/x-scad"},
},
Rules{
"root": {
{`[^\S\n]+`, Text, nil},
{`\n`, Text, nil},
{`//(\n|[\w\W]*?[^\\]\n)`, CommentSingle, nil},
{`/(\\\n)?[*][\w\W]*?[*](\\\n)?/`, CommentMultiline, nil},
{`/(\\\n)?[*][\w\W]*`, CommentMultiline, nil},
{`[{}\[\]\(\),;:]`, Punctuation, nil},
{`[*!#%\-+=?/]`, Operator, nil},
{`<|<=|==|!=|>=|>|&&|\|\|`, Operator, nil},
{`\$(f[asn]|t|vp[rtd]|children)`, NameVariableMagic, nil},
{Words(``, `\b`, `PI`, `undef`), KeywordConstant, nil},
{`(use|include)((?:\s|\\\\s)+)`, ByGroups(KeywordNamespace, Text), Push("includes")},
{`(module)(\s*)([^\s\(]+)`, ByGroups(KeywordNamespace, Text, NameNamespace), nil},
{`(function)(\s*)([^\s\(]+)`, ByGroups(KeywordDeclaration, Text, NameFunction), nil},
{`\b(true|false)\b`, Literal, nil},
{`\b(function|module|include|use|for|intersection_for|if|else|return)\b`, Keyword, nil},
{`\b(circle|square|polygon|text|sphere|cube|cylinder|polyhedron|translate|rotate|scale|resize|mirror|multmatrix|color|offset|hull|minkowski|union|difference|intersection|abs|sign|sin|cos|tan|acos|asin|atan|atan2|floor|round|ceil|ln|log|pow|sqrt|exp|rands|min|max|concat|lookup|str|chr|search|version|version_num|norm|cross|parent_module|echo|import|import_dxf|dxf_linear_extrude|linear_extrude|rotate_extrude|surface|projection|render|dxf_cross|dxf_dim|let|assign|len)\b`, NameBuiltin, nil},
{`\bchildren\b`, NameBuiltinPseudo, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`-?\d+(\.\d+)?(e[+-]?\d+)?`, Number, nil},
{`[a-zA-Z_]\w*`, Name, nil},
},
"includes": {
{"(<)([^>]*)(>)", ByGroups(Punctuation, CommentPreprocFile, Punctuation), nil},
Default(Pop(1)),
},
},
))
102 vendor/github.com/alecthomas/chroma/lexers/o/org.go generated vendored Normal file
@@ -0,0 +1,102 @@
package o

import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)

// Org mode lexer.
var Org = internal.Register(MustNewLexer(
&Config{
Name: "Org Mode",
Aliases: []string{"org", "orgmode"},
Filenames: []string{"*.org"},
MimeTypes: []string{"text/org"}, // https://lists.gnu.org/r/emacs-orgmode/2017-09/msg00087.html
},
Rules{
"root": {
{`^# .*$`, Comment, nil},
// Headings
{`^(\*)( COMMENT)( .*)$`, ByGroups(GenericHeading, NameEntity, GenericStrong), nil},
{`^(\*\*+)( COMMENT)( .*)$`, ByGroups(GenericSubheading, NameEntity, Text), nil},
{`^(\*)( DONE)( .*)$`, ByGroups(GenericHeading, LiteralStringRegex, GenericStrong), nil},
{`^(\*\*+)( DONE)( .*)$`, ByGroups(GenericSubheading, LiteralStringRegex, Text), nil},
{`^(\*)( TODO)( .*)$`, ByGroups(GenericHeading, Error, GenericStrong), nil},
{`^(\*\*+)( TODO)( .*)$`, ByGroups(GenericSubheading, Error, Text), nil},
{`^(\*)( .+?)( :[a-zA-Z0-9_@:]+:)$`, ByGroups(GenericHeading, GenericStrong, GenericEmph), nil}, // Level 1 heading with tags
{`^(\*)( .+)$`, ByGroups(GenericHeading, GenericStrong), nil}, // Level 1 heading with NO tags
{`^(\*\*+)( .+?)( :[a-zA-Z0-9_@:]+:)$`, ByGroups(GenericSubheading, Text, GenericEmph), nil}, // Level 2+ heading with tags
{`^(\*\*+)( .+)$`, ByGroups(GenericSubheading, Text), nil}, // Level 2+ heading with NO tags
// Checkbox lists
{`^( *)([+-] )(\[[ X]\])( .+)$`, ByGroups(Text, Keyword, Keyword, UsingSelf("inline")), nil},
{`^( +)(\* )(\[[ X]\])( .+)$`, ByGroups(Text, Keyword, Keyword, UsingSelf("inline")), nil},
// Definition lists
{`^( *)([+-] )([^ \n]+ ::)( .+)$`, ByGroups(Text, Keyword, Keyword, UsingSelf("inline")), nil},
{`^( +)(\* )([^ \n]+ ::)( .+)$`, ByGroups(Text, Keyword, Keyword, UsingSelf("inline")), nil},
// Unordered lists
{`^( *)([+-] )(.+)$`, ByGroups(Text, Keyword, UsingSelf("inline")), nil},
{`^( +)(\* )(.+)$`, ByGroups(Text, Keyword, UsingSelf("inline")), nil},
// Ordered lists
{`^( *)([0-9]+[.)])( \[@[0-9]+\])( .+)$`, ByGroups(Text, Keyword, GenericEmph, UsingSelf("inline")), nil},
{`^( *)([0-9]+[.)])( .+)$`, ByGroups(Text, Keyword, UsingSelf("inline")), nil},
// Dynamic Blocks
{`(?i)^( *#\+begin: )([^ ]+)([\w\W]*?\n)([\w\W]*?)(^ *#\+end: *$)`, ByGroups(Comment, CommentSpecial, Comment, UsingSelf("inline"), Comment), nil},
// Blocks
// - Comment Blocks
{`(?i)^( *#\+begin_comment *\n)([\w\W]*?)(^ *#\+end_comment *$)`, ByGroups(Comment, Comment, Comment), nil},
// - Src Blocks
{`(?i)^( *#\+begin_src )([^ \n]+)(.*?\n)([\w\W]*?)(^ *#\+end_src *$)`,
UsingByGroup(
internal.Get,
2, 4,
Comment, CommentSpecial, Comment, Text, Comment,
),
nil,
},
// - Export Blocks
{`(?i)^( *#\+begin_export )(\w+)( *\n)([\w\W]*?)(^ *#\+end_export *$)`,
UsingByGroup(
internal.Get,
2, 4,
Comment, CommentSpecial, Text, Text, Comment,
),
nil,
},
// - Org Special, Example, Verse, etc. Blocks
{`(?i)^( *#\+begin_)(\w+)( *\n)([\w\W]*?)(^ *#\+end_\2)( *$)`, ByGroups(Comment, Comment, Text, Text, Comment, Text), nil},
// Keywords
{`^(#\+\w+)(:.*)$`, ByGroups(CommentSpecial, Comment), nil}, // Other Org keywords like #+title
// Properties and Drawers
{`(?i)^( *:\w+: *\n)([\w\W]*?)(^ *:end: *$)`, ByGroups(Comment, CommentSpecial, Comment), nil},
// Line break operator
{`^(.*)(\\\\)$`, ByGroups(UsingSelf("inline"), Operator), nil},
// Deadline/Scheduled
{`(?i)^( *(?:DEADLINE|SCHEDULED): )(<[^<>]+?> *)$`, ByGroups(Comment, CommentSpecial), nil}, // DEADLINE/SCHEDULED: <datestamp>
// DONE state CLOSED
{`(?i)^( *CLOSED: )(\[[^][]+?\] *)$`, ByGroups(Comment, CommentSpecial), nil}, // CLOSED: [datestamp]
// All other lines
Include("inline"),
},
"inline": {
{`(\s)*(\*[^ \n*][^*]+?[^ \n*]\*)((?=\W|\n|$))`, ByGroups(Text, GenericStrong, Text), nil}, // Bold
{`(\s)*(/[^/]+?/)((?=\W|\n|$))`, ByGroups(Text, GenericEmph, Text), nil}, // Italic
{`(\s)*(=[^\n=]+?=)((?=\W|\n|$))`, ByGroups(Text, NameClass, Text), nil}, // Verbatim
{`(\s)*(~[^\n~]+?~)((?=\W|\n|$))`, ByGroups(Text, NameClass, Text), nil}, // Code
{`(\s)*(\+[^+]+?\+)((?=\W|\n|$))`, ByGroups(Text, GenericDeleted, Text), nil}, // Strikethrough
{`(\s)*(_[^_]+?_)((?=\W|\n|$))`, ByGroups(Text, GenericUnderline, Text), nil}, // Underline
{`(<)([^<>]+?)(>)`, ByGroups(Text, String, Text), nil}, // <datestamp>
{`[{]{3}[^}]+[}]{3}`, NameBuiltin, nil}, // {{{macro(foo,1)}}}
{`([^[])(\[fn:)([^]]+?)(\])([^]])`, ByGroups(Text, NameBuiltinPseudo, LiteralString, NameBuiltinPseudo, Text), nil}, // [fn:1]
// Links
{`(\[\[)([^][]+?)(\]\[)([^][]+)(\]\])`, ByGroups(Text, NameAttribute, Text, NameTag, Text), nil}, // [[link][descr]]
{`(\[\[)([^][]+?)(\]\])`, ByGroups(Text, NameAttribute, Text), nil}, // [[link]]
{`(<<)([^<>]+?)(>>)`, ByGroups(Text, NameAttribute, Text), nil}, // <<targetlink>>
// Tables
{`^( *)(\|[ -].*?[ -]\|)$`, ByGroups(Text, String), nil},
// Blank lines, newlines
{`\n`, Text, nil},
// Any other text
{`.`, Text, nil},
},
},
))
16 vendor/github.com/alecthomas/chroma/lexers/p/plaintext.go generated vendored Normal file
@@ -0,0 +1,16 @@
package p

import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)

var Plaintext = internal.Register(MustNewLexer(
&Config{
Name: "plaintext",
Aliases: []string{"text", "plain", "no-highlight"},
Filenames: []string{"*.txt"},
MimeTypes: []string{"text/plain"},
},
internal.PlaintextRules,
))
25
vendor/github.com/alecthomas/chroma/lexers/p/postgres.go
generated
vendored
@@ -21,6 +21,18 @@ var PostgreSQL = internal.Register(MustNewLexer(
		{`--.*\n?`, CommentSingle, nil},
		{`/\*`, CommentMultiline, Push("multiline-comments")},
		{`(bigint|bigserial|bit|bit\s+varying|bool|boolean|box|bytea|char|character|character\s+varying|cidr|circle|date|decimal|double\s+precision|float4|float8|inet|int|int2|int4|int8|integer|interval|json|jsonb|line|lseg|macaddr|money|numeric|path|pg_lsn|point|polygon|real|serial|serial2|serial4|serial8|smallint|smallserial|text|time|timestamp|timestamptz|timetz|tsquery|tsvector|txid_snapshot|uuid|varbit|varchar|with\s+time\s+zone|without\s+time\s+zone|xml|anyarray|anyelement|anyenum|anynonarray|anyrange|cstring|fdw_handler|internal|language_handler|opaque|record|void)\b`, NameBuiltin, nil},
+		{`(?s)(DO)(\s+)(?:(LANGUAGE)?(\s+)('?)(\w+)?('?)(\s+))?(\$)([^$]*)(\$)(.*?)(\$)(\10)(\$)`,
+			UsingByGroup(
+				internal.Get,
+				6, 12,
+				Keyword, Text, Keyword, Text, // DO LANGUAGE
+				StringSingle, StringSingle, StringSingle, Text, // 'plpgsql'
+				StringHeredoc, StringHeredoc, StringHeredoc, // $tag$
+				StringHeredoc, // (code block)
+				StringHeredoc, StringHeredoc, StringHeredoc, // $tag$
+			),
+			nil,
+		},
		{Words(``, `\b`, `ABORT`, `ABSOLUTE`, `ACCESS`, `ACTION`, `ADD`, `ADMIN`, `AFTER`, `AGGREGATE`, `ALL`, `ALSO`, `ALTER`, `ALWAYS`, `ANALYSE`, `ANALYZE`, `AND`, `ANY`, `ARRAY`, `AS`, `ASC`, `ASSERTION`, `ASSIGNMENT`, `ASYMMETRIC`, `AT`, `ATTRIBUTE`, `AUTHORIZATION`, `BACKWARD`, `BEFORE`, `BEGIN`, `BETWEEN`, `BIGINT`, `BINARY`, `BIT`, `BOOLEAN`, `BOTH`, `BY`, `CACHE`, `CALLED`, `CASCADE`, `CASCADED`, `CASE`, `CAST`, `CATALOG`, `CHAIN`, `CHAR`, `CHARACTER`, `CHARACTERISTICS`, `CHECK`, `CHECKPOINT`, `CLASS`, `CLOSE`, `CLUSTER`, `COALESCE`, `COLLATE`, `COLLATION`, `COLUMN`, `COMMENT`, `COMMENTS`, `COMMIT`, `COMMITTED`, `CONCURRENTLY`, `CONFIGURATION`, `CONNECTION`, `CONSTRAINT`, `CONSTRAINTS`, `CONTENT`, `CONTINUE`, `CONVERSION`, `COPY`, `COST`, `CREATE`, `CROSS`, `CSV`, `CURRENT`, `CURRENT_CATALOG`, `CURRENT_DATE`, `CURRENT_ROLE`, `CURRENT_SCHEMA`, `CURRENT_TIME`, `CURRENT_TIMESTAMP`, `CURRENT_USER`, `CURSOR`, `CYCLE`, `DATA`, `DATABASE`, `DAY`, `DEALLOCATE`, `DEC`, `DECIMAL`, `DECLARE`, `DEFAULT`, `DEFAULTS`, `DEFERRABLE`, `DEFERRED`, `DEFINER`, `DELETE`, `DELIMITER`, `DELIMITERS`, `DESC`, `DICTIONARY`, `DISABLE`, `DISCARD`, `DISTINCT`, `DO`, `DOCUMENT`, `DOMAIN`, `DOUBLE`, `DROP`, `EACH`, `ELSE`, `ENABLE`, `ENCODING`, `ENCRYPTED`, `END`, `ENUM`, `ESCAPE`, `EVENT`, `EXCEPT`, `EXCLUDE`, `EXCLUDING`, `EXCLUSIVE`, `EXECUTE`, `EXISTS`, `EXPLAIN`, `EXTENSION`, `EXTERNAL`, `EXTRACT`, `FALSE`, `FAMILY`, `FETCH`, `FILTER`, `FIRST`, `FLOAT`, `FOLLOWING`, `FOR`, `FORCE`, `FOREIGN`, `FORWARD`, `FREEZE`, `FROM`, `FULL`, `FUNCTION`, `FUNCTIONS`, `GLOBAL`, `GRANT`, `GRANTED`, `GREATEST`, `GROUP`, `HANDLER`, `HAVING`, `HEADER`, `HOLD`, `HOUR`, `IDENTITY`, `IF`, `ILIKE`, `IMMEDIATE`, `IMMUTABLE`, `IMPLICIT`, `IN`, `INCLUDING`, `INCREMENT`, `INDEX`, `INDEXES`, `INHERIT`, `INHERITS`, `INITIALLY`, `INLINE`, `INNER`, `INOUT`, `INPUT`, `INSENSITIVE`, `INSERT`, `INSTEAD`, `INT`, `INTEGER`, `INTERSECT`, `INTERVAL`, `INTO`, `INVOKER`, `IS`, `ISNULL`, `ISOLATION`, `JOIN`, `KEY`, `LABEL`, `LANGUAGE`, `LARGE`, `LAST`, `LATERAL`, `LC_COLLATE`, `LC_CTYPE`, `LEADING`, `LEAKPROOF`, `LEAST`, `LEFT`, `LEVEL`, `LIKE`, `LIMIT`, `LISTEN`, `LOAD`, `LOCAL`, `LOCALTIME`, `LOCALTIMESTAMP`, `LOCATION`, `LOCK`, `MAPPING`, `MATCH`, `MATERIALIZED`, `MAXVALUE`, `MINUTE`, `MINVALUE`, `MODE`, `MONTH`, `MOVE`, `NAME`, `NAMES`, `NATIONAL`, `NATURAL`, `NCHAR`, `NEXT`, `NO`, `NONE`, `NOT`, `NOTHING`, `NOTIFY`, `NOTNULL`, `NOWAIT`, `NULL`, `NULLIF`, `NULLS`, `NUMERIC`, `OBJECT`, `OF`, `OFF`, `OFFSET`, `OIDS`, `ON`, `ONLY`, `OPERATOR`, `OPTION`, `OPTIONS`, `OR`, `ORDER`, `ORDINALITY`, `OUT`, `OUTER`, `OVER`, `OVERLAPS`, `OVERLAY`, `OWNED`, `OWNER`, `PARSER`, `PARTIAL`, `PARTITION`, `PASSING`, `PASSWORD`, `PLACING`, `PLANS`, `POLICY`, `POSITION`, `PRECEDING`, `PRECISION`, `PREPARE`, `PREPARED`, `PRESERVE`, `PRIMARY`, `PRIOR`, `PRIVILEGES`, `PROCEDURAL`, `PROCEDURE`, `PROGRAM`, `QUOTE`, `RANGE`, `READ`, `REAL`, `REASSIGN`, `RECHECK`, `RECURSIVE`, `REF`, `REFERENCES`, `REFRESH`, `REINDEX`, `RELATIVE`, `RELEASE`, `RENAME`, `REPEATABLE`, `REPLACE`, `REPLICA`, `RESET`, `RESTART`, `RESTRICT`, `RETURNING`, `RETURNS`, `REVOKE`, `RIGHT`, `ROLE`, `ROLLBACK`, `ROW`, `ROWS`, `RULE`, `SAVEPOINT`, `SCHEMA`, `SCROLL`, `SEARCH`, `SECOND`, `SECURITY`, `SELECT`, `SEQUENCE`, `SEQUENCES`, `SERIALIZABLE`, `SERVER`, `SESSION`, `SESSION_USER`, `SET`, `SETOF`, `SHARE`, `SHOW`, `SIMILAR`, `SIMPLE`, `SMALLINT`, `SNAPSHOT`, `SOME`, `STABLE`, `STANDALONE`, `START`, `STATEMENT`, `STATISTICS`, `STDIN`, `STDOUT`, `STORAGE`, `STRICT`, `STRIP`, `SUBSTRING`, `SYMMETRIC`, `SYSID`, `SYSTEM`, `TABLE`, `TABLES`, `TABLESPACE`, `TEMP`, `TEMPLATE`, `TEMPORARY`, `TEXT`, `THEN`, `TIME`, `TIMESTAMP`, `TO`, `TRAILING`, `TRANSACTION`, `TREAT`, `TRIGGER`, `TRIM`, `TRUE`, `TRUNCATE`, `TRUSTED`, `TYPE`, `TYPES`, `UNBOUNDED`, `UNCOMMITTED`, `UNENCRYPTED`, `UNION`, `UNIQUE`, `UNKNOWN`, `UNLISTEN`, `UNLOGGED`, `UNTIL`, `UPDATE`, `USER`, `USING`, `VACUUM`, `VALID`, `VALIDATE`, `VALIDATOR`, `VALUE`, `VALUES`, `VARCHAR`, `VARIADIC`, `VARYING`, `VERBOSE`, `VERSION`, `VIEW`, `VIEWS`, `VOLATILE`, `WHEN`, `WHERE`, `WHITESPACE`, `WINDOW`, `WITH`, `WITHIN`, `WITHOUT`, `WORK`, `WRAPPER`, `WRITE`, `XML`, `XMLATTRIBUTES`, `XMLCONCAT`, `XMLELEMENT`, `XMLEXISTS`, `XMLFOREST`, `XMLPARSE`, `XMLPI`, `XMLROOT`, `XMLSERIALIZE`, `YEAR`, `YES`, `ZONE`), Keyword, nil},
		{"[+*/<>=~!@#%^&|`?-]+", Operator, nil},
		{`::`, Operator, nil},
@@ -29,7 +41,18 @@ var PostgreSQL = internal.Register(MustNewLexer(
		{`[0-9]+`, LiteralNumberInteger, nil},
		{`((?:E|U&)?)(')`, ByGroups(LiteralStringAffix, LiteralStringSingle), Push("string")},
		{`((?:U&)?)(")`, ByGroups(LiteralStringAffix, LiteralStringName), Push("quoted-ident")},
-		// { `(?s)(\$)([^$]*)(\$)(.*?)(\$)(\2)(\$)`, ?? <function language_callback at 0x101105400> ??, nil },
+		{`(?s)(\$)([^$]*)(\$)(.*?)(\$)(\2)(\$)(\s+)(LANGUAGE)?(\s+)('?)(\w+)?('?)`,
+			UsingByGroup(internal.Get,
+				12, 4,
+				StringHeredoc, StringHeredoc, StringHeredoc, // $tag$
+				StringHeredoc, // (code block)
+				StringHeredoc, StringHeredoc, StringHeredoc, // $tag$
+				Text, Keyword, Text, // <space> LANGUAGE <space>
+				StringSingle, StringSingle, StringSingle, // 'type'
+			),
+			nil,
+		},
		{`(?s)(\$)([^$]*)(\$)(.*?)(\$)(\2)(\$)`, LiteralStringHeredoc, nil},
		{`[a-z_]\w*`, Name, nil},
		{`:(['"]?)[a-z]\w*\b\1`, NameVariable, nil},
		{`[;:()\[\]{},.]`, Punctuation, nil},
4
vendor/github.com/alecthomas/chroma/lexers/r/rst.go
generated
vendored
@@ -61,7 +61,7 @@ var Restructuredtext = internal.Register(MustNewLexer(

func rstCodeBlock(groups []string, lexer Lexer) Iterator {
	iterators := []Iterator{}
-	tokens := []*Token{
+	tokens := []Token{
		{Punctuation, groups[1]},
		{Text, groups[2]},
		{OperatorWord, groups[3]},
@@ -73,7 +73,7 @@ func rstCodeBlock(groups []string, lexer Lexer) Iterator {
	code := strings.Join(groups[8:], "")
	lexer = internal.Get(groups[6])
	if lexer == nil {
-		tokens = append(tokens, &Token{String, code})
+		tokens = append(tokens, Token{String, code})
		iterators = append(iterators, Literator(tokens...))
	} else {
		sub, err := lexer.Tokenise(nil, code)
1
vendor/github.com/alecthomas/chroma/lexers/r/rust.go
generated
vendored
@@ -12,6 +12,7 @@ var Rust = internal.Register(MustNewLexer(
		Aliases:   []string{"rust"},
		Filenames: []string{"*.rs", "*.rs.in"},
		MimeTypes: []string{"text/rust"},
+		EnsureNL:  true,
	},
	Rules{
		"root": {
14
vendor/github.com/alecthomas/chroma/lexers/s/sass.go
generated
vendored
@@ -15,13 +15,13 @@ var Sass = internal.Register(MustNewLexer(
		CaseInsensitive: true,
	},
	Rules{
		// "root": {
		// },
		"root": {
			{`[ \t]*\n`, Text, nil},
			// { `[ \t]*`, ?? <function _indentation at 0x10fcaf1e0> ??, nil },
		},
		"content": {
			// { `//[^\n]*`, ?? <function _starts_block.<locals>.callback at 0x10fcaf378> ??, Push("root") },
			// { `/\*[^\n]*`, ?? <function _starts_block.<locals>.callback at 0x10fcaf400> ??, Push("root") },
			// { `[ \t]*`, ?? <function _indentation at 0x106932e18> ??, nil },
			// { `//[^\n]*`, ?? <function _starts_block.<locals>.callback at 0x106936048> ??, Push("root") },
			// { `/\*[^\n]*`, ?? <function _starts_block.<locals>.callback at 0x1069360d0> ??, Push("root") },
			{`@import`, Keyword, Push("import")},
			{`@for`, Keyword, Push("for")},
			{`@(debug|warn|if|while)`, Keyword, Push("value")},
@@ -112,9 +112,9 @@ var Sass = internal.Register(MustNewLexer(
			{`"`, LiteralStringDouble, Pop(1)},
		},
		"string-single": {
-			{`(\\.|#(?=[^\n{])|[^\n'#])+`, LiteralStringDouble, nil},
+			{`(\\.|#(?=[^\n{])|[^\n'#])+`, LiteralStringSingle, nil},
			{`#\{`, LiteralStringInterpol, Push("interpolation")},
-			{`'`, LiteralStringDouble, Pop(1)},
+			{`'`, LiteralStringSingle, Pop(1)},
		},
		"string-url": {
			{`(\\#|#(?=[^\n{])|[^\n#)])+`, LiteralStringOther, nil},
4
vendor/github.com/alecthomas/chroma/lexers/s/smarty.go
generated
vendored
@@ -1,9 +1,9 @@
package s

import (
	. "github.com/alecthomas/chroma" // nolint
	. "github.com/alecthomas/chroma/lexers/circular" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
	. "github.com/alecthomas/chroma/lexers/p" // nolint
)

// Smarty lexer.
28
vendor/github.com/alecthomas/chroma/lexers/s/systemd.go
generated
vendored
Normal file
@@ -0,0 +1,28 @@
+package s
+
+import (
+	. "github.com/alecthomas/chroma" // nolint
+	"github.com/alecthomas/chroma/lexers/internal"
+)
+
+var SYSTEMD = internal.Register(MustNewLexer(
+	&Config{
+		Name:      "SYSTEMD",
+		Aliases:   []string{"systemd"},
+		Filenames: []string{"*.service"},
+		MimeTypes: []string{"text/plain"},
+	},
+	Rules{
+		"root": {
+			{`\s+`, Text, nil},
+			{`[;#].*`, Comment, nil},
+			{`\[.*?\]$`, Keyword, nil},
+			{`(.*?)(=)(.*)(\\\n)`, ByGroups(NameAttribute, Operator, LiteralString, Text), Push("continuation")},
+			{`(.*?)(=)(.*)`, ByGroups(NameAttribute, Operator, LiteralString), nil},
+		},
+		"continuation": {
+			{`(.*?)(\\\n)`, ByGroups(LiteralString, Text), nil},
+			{`(.*)`, LiteralString, Pop(1)},
+		},
+	},
+))
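The continued-assignment rule in the systemd lexer is RE2-compatible, so its group layout can be checked with Go's standard library. A stand-alone sketch (`SplitContinued` is a hypothetical helper, not chroma API):

```go
package main

import (
	"fmt"
	"regexp"
)

// contRule mirrors the root rule `(.*?)(=)(.*)(\\\n)`: key, "=",
// first-line value, then the backslash-newline that (in the lexer)
// pushes the "continuation" state.
var contRule = regexp.MustCompile(`(.*?)(=)(.*)(\\\n)`)

// SplitContinued returns the key and first-line value of a continued
// assignment, or ok=false when the line is not continued.
func SplitContinued(s string) (key, val string, ok bool) {
	m := contRule.FindStringSubmatch(s)
	if m == nil {
		return "", "", false
	}
	return m[1], m[3], true
}

func main() {
	k, v, _ := SplitContinued("ExecStart=/usr/bin/foo \\\n  --bar")
	fmt.Printf("%s | %s\n", k, v)
}
```

Note the greedy `(.*)` stops before the trailing backslash so that group 4 can still match, leaving the visible value (including its trailing space) in group 3.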
36
vendor/github.com/alecthomas/chroma/lexers/t/tradingview.go
generated
vendored
Normal file
@@ -0,0 +1,36 @@
+package t
+
+import (
+	. "github.com/alecthomas/chroma" // nolint
+	"github.com/alecthomas/chroma/lexers/internal"
+)
+
+// TradingView lexer.
+var TradingView = internal.Register(MustNewLexer(
+	&Config{
+		Name:      "TradingView",
+		Aliases:   []string{"tradingview", "tv"},
+		Filenames: []string{"*.tv"},
+		MimeTypes: []string{"text/x-tradingview"},
+		DotAll:    true,
+	},
+	Rules{
+		"root": {
+			{`[^\S\n]+|\n|[()]`, Text, nil},
+			{`(//.*?)(\n)`, ByGroups(CommentSingle, Text), nil},
+			{`>=|<=|==|!=|>|<|\?|-|\+|\*|\/|%|\[|\]`, Operator, nil},
+			{`[:,.]`, Punctuation, nil},
+			{`=`, KeywordPseudo, nil},
+			{`"(\\\\|\\"|[^"\n])*["\n]`, LiteralString, nil},
+			{`'\\.'|'[^\\]'`, LiteralString, nil},
+			{`[0-9](\.[0-9]*)?([eE][+-][0-9]+)?`, LiteralNumber, nil},
+			{`(abs|acos|alertcondition|alma|asin|atan|atr|avg|barcolor|barssince|bgcolor|cci|ceil|change|cog|correlation|cos|crossover|crossunder|cum|dev|ema|exp|falling|fill|fixnan|floor|heikinashi|highest|highestbars|hline|iff|input|kagi|linebreak|linreg|log|log10|lowest|lowestbars|macd|max|min|mom|nz|percentile_linear_interpolation|percentile_nearest_rank|percentrank|pivothigh|pivotlow|plot|plotarrow|plotbar|plotcandle|plotchar|plotshape|pointfigure|pow|renko|rising|rma|roc|round|rsi|sar|security|sign|sin|sma|sqrt|stdev|stoch|study|sum|swma|tan|tostring|tsi|valuewhen|variance|vwma|wma|strategy\.(cancel|cancel_all|close|close_all|entry|exit|order)|strategy\.risk\.(allow_entry_in|max_cons_loss_days|max_drawdown|max_intraday_filled_orders|max_intraday_loss|max_position_size))\b`, NameFunction, nil},
+			{`\b(cross|dayofmonth|dayofweek|hour|minute|month|na|offset|second|tickerid|time|tr|vwap|weekofyear|year)(\()`, ByGroups(NameFunction, Text), nil}, // functions that can also be variable
+			{`(accdist|aqua|area|areabr|black|blue|bool|circles|close|columns|currency\.(AUD|CAD|CHF|EUR|GBP|HKD|JPY|NOK|NONE|NZD|SEK|SGD|TRY|USD|ZAR)|dashed|dotted|float|friday|fuchsia|gray|green|high|histogram|hl2|hlc3|integer|interval|isdaily|isdwm|isintraday|ismonthly|isweekly|lime|line|linebr|location\.(abovebar|belowbar|bottom|top)|low|maroon|monday|n|navy|ohlc4|olive|open|orange|period|purple|red|resolution|saturday|scale\.(left|none|right)|session|session\.(extended|regular)|silver|size\.(auto|huge|large|normal|small|tiny)|solid|source|string|sunday|symbol|syminfo\.(mintick|pointvalue|prefix|root|session)|teal|thursday|ticker|tuesday|volume|wednesday|white|yellow|strategy\.(cash|position_size|closedtrades|direction\.(all|long|short)|equity|eventrades|fixed|grossloss|grossprofit|initial_capital|long|losstrades|max_contracts_held_all|max_contracts_held_long|max_contracts_held_short|max_drawdown|netprofit|oca\.(cancel|none|reduce)|openprofit|opentrades|percent_of_equity|position_avg_price|position_entry_name|short|wintrades)|shape\.(arrowdown|arrowup|circle|cross|diamond|flag|labeldown|labelup|square|triangledown|triangleup|xcross)|barstate\.is(first|history|last|new|realtime)|barmerge\.(gaps_on|gaps_off|lookahead_on|lookahead_off)|strategy\.commission\.(cash_per_contract|cash_per_order|percent))\b`, NameVariable, nil},
+			{`(cross|dayofmonth|dayofweek|hour|minute|month|na|second|tickerid|time|tr|vwap|weekofyear|year)(\b[^\(])`, ByGroups(NameVariable, Text), nil}, // variables that can also be function
+			{`(true|false)\b`, KeywordConstant, nil},
+			{`(and|or|not|if|else|for|to)\b`, OperatorWord, nil},
+			{`@?[_a-zA-Z]\w*`, Text, nil},
+		},
+	},
+))
73
vendor/github.com/alecthomas/chroma/lexers/v/vb.go
generated
vendored
Normal file
@@ -0,0 +1,73 @@
+package v
+
+import (
+	. "github.com/alecthomas/chroma" // nolint
+	"github.com/alecthomas/chroma/lexers/internal"
+)
+
+const vbName = `[_\w][\w]*`
+
+// VB.Net lexer.
+var VBNet = internal.Register(MustNewLexer(
+	&Config{
+		Name:            "VB.net",
+		Aliases:         []string{"vb.net", "vbnet"},
+		Filenames:       []string{"*.vb", "*.bas"},
+		MimeTypes:       []string{"text/x-vbnet", "text/x-vba"},
+		CaseInsensitive: true,
+	},
+	Rules{
+		"root": {
+			{`^\s*<.*?>`, NameAttribute, nil},
+			{`\s+`, Text, nil},
+			{`\n`, Text, nil},
+			{`rem\b.*?\n`, Comment, nil},
+			{`'.*?\n`, Comment, nil},
+			{`#If\s.*?\sThen|#ElseIf\s.*?\sThen|#Else|#End\s+If|#Const|#ExternalSource.*?\n|#End\s+ExternalSource|#Region.*?\n|#End\s+Region|#ExternalChecksum`, CommentPreproc, nil},
+			{`[(){}!#,.:]`, Punctuation, nil},
+			{`Option\s+(Strict|Explicit|Compare)\s+(On|Off|Binary|Text)`, KeywordDeclaration, nil},
+			{Words(`(?<!\.)`, `\b`, `AddHandler`, `Alias`, `ByRef`, `ByVal`, `Call`, `Case`, `Catch`, `CBool`, `CByte`, `CChar`, `CDate`, `CDec`, `CDbl`, `CInt`, `CLng`, `CObj`, `Continue`, `CSByte`, `CShort`, `CSng`, `CStr`, `CType`, `CUInt`, `CULng`, `CUShort`, `Declare`, `Default`, `Delegate`, `DirectCast`, `Do`, `Each`, `Else`, `ElseIf`, `EndIf`, `Erase`, `Error`, `Event`, `Exit`, `False`, `Finally`, `For`, `Friend`, `Get`, `Global`, `GoSub`, `GoTo`, `Handles`, `If`, `Implements`, `Inherits`, `Interface`, `Let`, `Lib`, `Loop`, `Me`, `MustInherit`, `MustOverride`, `MyBase`, `MyClass`, `Narrowing`, `New`, `Next`, `Not`, `Nothing`, `NotInheritable`, `NotOverridable`, `Of`, `On`, `Operator`, `Option`, `Optional`, `Overloads`, `Overridable`, `Overrides`, `ParamArray`, `Partial`, `Private`, `Protected`, `Public`, `RaiseEvent`, `ReadOnly`, `ReDim`, `RemoveHandler`, `Resume`, `Return`, `Select`, `Set`, `Shadows`, `Shared`, `Single`, `Static`, `Step`, `Stop`, `SyncLock`, `Then`, `Throw`, `To`, `True`, `Try`, `TryCast`, `Wend`, `Using`, `When`, `While`, `Widening`, `With`, `WithEvents`, `WriteOnly`), Keyword, nil},
+			{`(?<!\.)End\b`, Keyword, Push("end")},
+			{`(?<!\.)(Dim|Const)\b`, Keyword, Push("dim")},
+			{`(?<!\.)(Function|Sub|Property)(\s+)`, ByGroups(Keyword, Text), Push("funcname")},
+			{`(?<!\.)(Class|Structure|Enum)(\s+)`, ByGroups(Keyword, Text), Push("classname")},
+			{`(?<!\.)(Module|Namespace|Imports)(\s+)`, ByGroups(Keyword, Text), Push("namespace")},
+			{`(?<!\.)(Boolean|Byte|Char|Date|Decimal|Double|Integer|Long|Object|SByte|Short|Single|String|Variant|UInteger|ULong|UShort)\b`, KeywordType, nil},
+			{`(?<!\.)(AddressOf|And|AndAlso|As|GetType|In|Is|IsNot|Like|Mod|Or|OrElse|TypeOf|Xor)\b`, OperatorWord, nil},
+			{`&=|[*]=|/=|\\=|\^=|\+=|-=|<<=|>>=|<<|>>|:=|<=|>=|<>|[-&*/\\^+=<>\[\]]`, Operator, nil},
+			{`"`, LiteralString, Push("string")},
+			{`_\n`, Text, nil},
+			{vbName, Name, nil},
+			{`#.*?#`, LiteralDate, nil},
+			{`(\d+\.\d*|\d*\.\d+)(F[+-]?[0-9]+)?`, LiteralNumberFloat, nil},
+			{`\d+([SILDFR]|US|UI|UL)?`, LiteralNumberInteger, nil},
+			{`&H[0-9a-f]+([SILDFR]|US|UI|UL)?`, LiteralNumberInteger, nil},
+			{`&O[0-7]+([SILDFR]|US|UI|UL)?`, LiteralNumberInteger, nil},
+		},
+		"string": {
+			{`""`, LiteralString, nil},
+			{`"C?`, LiteralString, Pop(1)},
+			{`[^"]+`, LiteralString, nil},
+		},
+		"dim": {
+			{vbName, NameVariable, Pop(1)},
+			Default(Pop(1)),
+		},
+		"funcname": {
+			{vbName, NameFunction, Pop(1)},
+		},
+		"classname": {
+			{vbName, NameClass, Pop(1)},
+		},
+		"namespace": {
+			{vbName, NameNamespace, nil},
+			{`\.`, NameNamespace, nil},
+			Default(Pop(1)),
+		},
+		"end": {
+			{`\s+`, Text, nil},
+			{`(Function|Sub|Property|Class|Structure|Enum|Module|Namespace)\b`, Keyword, Pop(1)},
+			Default(Pop(1)),
+		},
	},
+))
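Most of the VB.Net rules need chroma's backtracking engine (the `(?<!\.)` lookbehinds are not RE2), but the hex-literal rule is plain RE2 once the lexer's `CaseInsensitive: true` config is expressed as an inline `(?i)`. A stand-alone sketch (`HexLiteral` is a hypothetical helper, not chroma API):

```go
package main

import (
	"fmt"
	"regexp"
)

// vbHex mirrors the rule `&H[0-9a-f]+([SILDFR]|US|UI|UL)?`; the lexer-wide
// CaseInsensitive flag is written here as an inline (?i).
var vbHex = regexp.MustCompile(`(?i)&H[0-9a-f]+([SILDFR]|US|UI|UL)?`)

// HexLiteral returns the hex literal at the start of s, or "".
func HexLiteral(s string) string {
	if loc := vbHex.FindStringIndex(s); loc != nil && loc[0] == 0 {
		return s[loc[0]:loc[1]]
	}
	return ""
}

func main() {
	fmt.Println(HexLiteral("&HFF0AUL + 1"))
}
```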
2
vendor/github.com/alecthomas/chroma/lexers/y/yaml.go
generated
vendored
@@ -20,7 +20,7 @@ var YAML = internal.Register(MustNewLexer(
		{`&[^\s]+`, CommentPreproc, nil},
		{`\*[^\s]+`, CommentPreproc, nil},
		{`^%include\s+[^\n\r]+`, CommentPreproc, nil},
-		{`([>|])(\s+)((?:(?:.*?$)(?:[\n\r]*?\2)?)*)`, ByGroups(StringDoc, StringDoc, StringDoc), nil},
+		{`([>|+-]\s+)(\s+)((?:(?:.*?$)(?:[\n\r]*?)?)*)`, ByGroups(StringDoc, StringDoc, StringDoc), nil},
		Include("value"),
		{`[?:,\[\]]`, Punctuation, nil},
		{`.`, Text, nil},
9
vendor/github.com/alecthomas/chroma/mutators.go
generated
vendored
@@ -120,3 +120,12 @@ func Pop(n int) MutatorFunc {
func Default(mutators ...Mutator) Rule {
	return Rule{Mutator: Mutators(mutators...)}
}
+
+// Stringify returns the raw string for a set of tokens.
+func Stringify(tokens ...Token) string {
+	out := []string{}
+	for _, t := range tokens {
+		out = append(out, t.Value)
+	}
+	return strings.Join(out, "")
+}
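The new `Stringify` helper simply concatenates token values back into source text. A self-contained copy behaves the same, here with a trimmed stand-in `Token` type instead of chroma's:

```go
package main

import (
	"fmt"
	"strings"
)

// Token is a one-field stand-in for chroma.Token, enough to show what
// Stringify does.
type Token struct {
	Value string
}

// Stringify joins the raw values of the tokens, mirroring the helper
// added in mutators.go above.
func Stringify(tokens ...Token) string {
	out := []string{}
	for _, t := range tokens {
		out = append(out, t.Value)
	}
	return strings.Join(out, "")
}

func main() {
	fmt.Println(Stringify(Token{"func"}, Token{" "}, Token{"main"}))
}
```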
94
vendor/github.com/alecthomas/chroma/regexp.go
generated
vendored
@@ -33,7 +33,7 @@ func (e EmitterFunc) Emit(groups []string, lexer Lexer) Iterator { return e(grou
func ByGroups(emitters ...Emitter) Emitter {
	return EmitterFunc(func(groups []string, lexer Lexer) Iterator {
		iterators := make([]Iterator, 0, len(groups)-1)
-		// NOTE: If this panics, there is a mismatch with groups. Uncomment the following line to debug.
+		// NOTE: If this panics, there is a mismatch with groups
		for i, group := range groups[1:] {
			iterators = append(iterators, emitters[i].Emit([]string{group}, lexer))
		}
@@ -41,6 +41,74 @@ func ByGroups(emitters ...Emitter) Emitter {
	})
}
+
+// UsingByGroup emits tokens for the matched groups in the regex using a
+// "sublexer". Used when lexing code blocks where the name of a sublexer is
+// contained within the block, for example on a Markdown text block or SQL
+// language block.
+//
+// The sublexer will be retrieved using sublexerGetFunc (typically
+// internal.Get), using the captured value from the matched sublexerNameGroup.
+//
+// If sublexerGetFunc returns a non-nil lexer for the captured sublexerNameGroup,
+// then tokens for the matched codeGroup will be emitted using the retrieved
+// lexer. Otherwise, if the sublexer is nil, then tokens will be emitted from
+// the passed emitter.
+//
+// Example:
+//
+//	var Markdown = internal.Register(MustNewLexer(
+//		&Config{
+//			Name:      "markdown",
+//			Aliases:   []string{"md", "mkd"},
+//			Filenames: []string{"*.md", "*.mkd", "*.markdown"},
+//			MimeTypes: []string{"text/x-markdown"},
+//		},
+//		Rules{
+//			"root": {
+//				{"^(```)(\\w+)(\\n)([\\w\\W]*?)(^```$)",
+//					UsingByGroup(
+//						internal.Get,
+//						2, 4,
+//						String, String, String, Text, String,
+//					),
+//					nil,
+//				},
+//			},
+//		},
+//	))
+//
+// See the lexers/m/markdown.go for the complete example.
+//
+// Note: panics if the number of emitters does not equal the number of matched
+// groups in the regex.
+func UsingByGroup(sublexerGetFunc func(string) Lexer, sublexerNameGroup, codeGroup int, emitters ...Emitter) Emitter {
+	return EmitterFunc(func(groups []string, lexer Lexer) Iterator {
+		// bounds check
+		if len(emitters) != len(groups)-1 {
+			panic("UsingByGroup expects number of emitters to be the same as len(groups)-1")
+		}
+
+		// grab sublexer
+		sublexer := sublexerGetFunc(groups[sublexerNameGroup])
+
+		// build iterators
+		iterators := make([]Iterator, len(groups)-1)
+		for i, group := range groups[1:] {
+			if i == codeGroup-1 && sublexer != nil {
+				var err error
+				iterators[i], err = sublexer.Tokenise(nil, groups[codeGroup])
+				if err != nil {
+					panic(err)
+				}
+			} else {
+				iterators[i] = emitters[i].Emit([]string{group}, lexer)
+			}
+		}
+
+		return Concaterator(iterators...)
+	})
+}
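The dispatch `UsingByGroup` performs can be sketched without chroma's types: look up a "sublexer" by the name captured in one regex group, run it over the code captured in another, and fall back to the plain emitter otherwise. All names below are hypothetical stand-ins; the registry plays the role of `internal.Get`:

```go
package main

import (
	"fmt"
	"strings"
)

// registry is a toy stand-in for internal.Get: it maps a captured
// language name to a "sublexer" (here just a string transform).
var registry = map[string]func(code string) string{
	"shout": func(code string) string { return strings.ToUpper(code) },
}

// EmitByGroup sketches UsingByGroup's core decision: groups[0] is the full
// match; nameGroup and codeGroup are 1-based capture indices; fallback
// stands in for the ordinary per-group emitter.
func EmitByGroup(groups []string, nameGroup, codeGroup int, fallback func(string) string) string {
	if sub, ok := registry[groups[nameGroup]]; ok {
		return sub(groups[codeGroup]) // a sublexer was found: use it on the code group
	}
	return fallback(groups[codeGroup]) // unknown name: emit the code as-is
}

func main() {
	groups := []string{"```shout\nhi\n```", "shout", "hi"}
	fmt.Println(EmitByGroup(groups, 1, 2, func(s string) string { return s }))
}
```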

// Using returns an Emitter that uses a given Lexer for parsing and emitting.
func Using(lexer Lexer) Emitter {
	return EmitterFunc(func(groups []string, _ Lexer) Iterator {
@@ -72,13 +140,13 @@ func Words(prefix, suffix string, words ...string) string {
}

// Tokenise text using lexer, returning tokens as a slice.
-func Tokenise(lexer Lexer, options *TokeniseOptions, text string) ([]*Token, error) {
-	out := []*Token{}
+func Tokenise(lexer Lexer, options *TokeniseOptions, text string) ([]Token, error) {
+	var out []Token
	it, err := lexer.Tokenise(options, text)
	if err != nil {
		return nil, err
	}
-	for t := it(); t != nil; t = it() {
+	for t := it(); t != EOF; t = it() {
		out = append(out, t)
	}
	return out, nil
@@ -178,13 +246,13 @@ func (l *LexerState) Get(key interface{}) interface{} {
	return l.MutatorContext[key]
}

-func (l *LexerState) Iterator() *Token {
+func (l *LexerState) Iterator() Token {
	for l.Pos < len(l.Text) && len(l.Stack) > 0 {
		// Exhaust the iterator stack, if any.
		for len(l.iteratorStack) > 0 {
			n := len(l.iteratorStack) - 1
			t := l.iteratorStack[n]()
-			if t == nil {
+			if t == EOF {
				l.iteratorStack = l.iteratorStack[:n]
				continue
			}
@@ -195,11 +263,15 @@ func (l *LexerState) Iterator() *Token {
		if l.Lexer.trace {
			fmt.Fprintf(os.Stderr, "%s: pos=%d, text=%q\n", l.State, l.Pos, string(l.Text[l.Pos:]))
		}
-		ruleIndex, rule, groups := matchRules(l.Text[l.Pos:], l.Rules[l.State])
+		selectedRule, ok := l.Rules[l.State]
+		if !ok {
+			panic("unknown state " + l.State)
+		}
+		ruleIndex, rule, groups := matchRules(l.Text[l.Pos:], selectedRule)
		// No match.
		if groups == nil {
			l.Pos++
-			return &Token{Error, string(l.Text[l.Pos-1 : l.Pos])}
+			return Token{Error, string(l.Text[l.Pos-1 : l.Pos])}
		}
		l.Rule = ruleIndex
		l.Groups = groups
@@ -218,7 +290,7 @@ func (l *LexerState) Iterator() *Token {
		for len(l.iteratorStack) > 0 {
			n := len(l.iteratorStack) - 1
			t := l.iteratorStack[n]()
-			if t == nil {
+			if t == EOF {
				l.iteratorStack = l.iteratorStack[:n]
				continue
			}
@@ -229,9 +301,9 @@ func (l *LexerState) Iterator() *Token {
	if l.Pos != len(l.Text) && len(l.Stack) == 0 {
		value := string(l.Text[l.Pos:])
		l.Pos = len(l.Text)
-		return &Token{Type: Error, Value: value}
+		return Token{Type: Error, Value: value}
	}
-	return nil
+	return EOF
}

type RegexLexer struct {
24
vendor/github.com/alecthomas/chroma/remap.go
generated
vendored
@@ -2,11 +2,11 @@ package chroma

type remappingLexer struct {
	lexer  Lexer
-	mapper func(*Token) []*Token
+	mapper func(Token) []Token
}

// RemappingLexer remaps a token to a set of, potentially empty, tokens.
-func RemappingLexer(lexer Lexer, mapper func(*Token) []*Token) Lexer {
+func RemappingLexer(lexer Lexer, mapper func(Token) []Token) Lexer {
	return &remappingLexer{lexer, mapper}
}

@@ -19,8 +19,8 @@ func (r *remappingLexer) Tokenise(options *TokeniseOptions, text string) (Iterat
	if err != nil {
		return nil, err
	}
-	buffer := []*Token{}
-	return func() *Token {
+	var buffer []Token
+	return func() Token {
		for {
			if len(buffer) > 0 {
				t := buffer[0]
@@ -28,7 +28,7 @@ func (r *remappingLexer) Tokenise(options *TokeniseOptions, text string) (Iterat
				return t
			}
			t := it()
-			if t == nil {
+			if t == EOF {
				return t
			}
			buffer = r.mapper(t)
@@ -58,17 +58,23 @@ func TypeRemappingLexer(lexer Lexer, mapping TypeMapping) Lexer {
			km = map[string]TokenType{}
			lut[rt.From] = km
		}
-		for _, k := range rt.Words {
-			km[k] = rt.To
+		if len(rt.Words) == 0 {
+			km[""] = rt.To
+		} else {
+			for _, k := range rt.Words {
+				km[k] = rt.To
+			}
		}

	}
-	return RemappingLexer(lexer, func(t *Token) []*Token {
+	return RemappingLexer(lexer, func(t Token) []Token {
		if k, ok := lut[t.Type]; ok {
			if tt, ok := k[t.Value]; ok {
				t.Type = tt
			} else if tt, ok := k[""]; ok {
				t.Type = tt
			}
		}
-		return []*Token{t}
+		return []Token{t}
	})
}
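The `TypeRemappingLexer` change above makes an empty `Words` list install a wildcard (`""`) entry, so every value of the mapped type is remapped, not only explicitly listed words. A minimal sketch of the table-building and lookup logic, with stand-in types and hypothetical helper names rather than chroma's:

```go
package main

import "fmt"

// TokenType is a stand-in for chroma.TokenType.
type TokenType string

// buildWordMap mirrors the per-type table TypeRemappingLexer builds:
// no words means a wildcard entry that remaps any value.
func buildWordMap(to TokenType, words []string) map[string]TokenType {
	km := map[string]TokenType{}
	if len(words) == 0 {
		km[""] = to // wildcard: remap every value of this type
	} else {
		for _, w := range words {
			km[w] = to
		}
	}
	return km
}

// remapValue applies the table the way the returned mapper does: exact
// word first, then the wildcard, otherwise the type is left unchanged.
func remapValue(km map[string]TokenType, orig TokenType, value string) TokenType {
	if tt, ok := km[value]; ok {
		return tt
	}
	if tt, ok := km[""]; ok {
		return tt
	}
	return orig
}

func main() {
	km := buildWordMap("Keyword", nil) // empty word list: wildcard
	fmt.Println(remapValue(km, "Name", "anything"))
}
```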
1
vendor/github.com/alecthomas/chroma/styles/autumn.go
generated
vendored
@@ -37,6 +37,7 @@ var Autumn = Register(chroma.MustNewStyle("autumn", chroma.StyleEntries{
	chroma.GenericPrompt:    "#555555",
	chroma.GenericOutput:    "#888888",
	chroma.GenericTraceback: "#aa0000",
+	chroma.GenericUnderline: "underline",
	chroma.Error:            "#F00 bg:#FAA",
	chroma.Background:       " bg:#ffffff",
}))
1
vendor/github.com/alecthomas/chroma/styles/borland.go
generated
vendored
@@ -27,6 +27,7 @@ var Borland = Register(chroma.MustNewStyle("borland", chroma.StyleEntries{
	chroma.GenericPrompt:    "#555555",
	chroma.GenericOutput:    "#888888",
	chroma.GenericTraceback: "#aa0000",
+	chroma.GenericUnderline: "underline",
	chroma.Error:            "bg:#e3d2d2 #a61717",
	chroma.Background:       " bg:#ffffff",
}))
1
vendor/github.com/alecthomas/chroma/styles/colorful.go
generated
vendored
@@ -53,6 +53,7 @@ var Colorful = Register(chroma.MustNewStyle("colorful", chroma.StyleEntries{
	chroma.GenericPrompt:    "bold #c65d09",
	chroma.GenericOutput:    "#888",
	chroma.GenericTraceback: "#04D",
+	chroma.GenericUnderline: "underline",
	chroma.Error:            "#F00 bg:#FAA",
	chroma.Background:       " bg:#ffffff",
}))
1
vendor/github.com/alecthomas/chroma/styles/dracula.go
generated
vendored
@@ -23,6 +23,7 @@ var Dracula = Register(chroma.MustNewStyle("dracula", chroma.StyleEntries{
	chroma.GenericStrong:     "#f8f8f2",
	chroma.GenericSubheading: "#f8f8f2 bold",
	chroma.GenericTraceback:  "#f8f8f2",
+	chroma.GenericUnderline:  "underline",
	chroma.Error:             "#f8f8f2",
	chroma.Keyword:           "#ff79c6",
	chroma.KeywordConstant:   "#ff79c6",
1
vendor/github.com/alecthomas/chroma/styles/emacs.go
generated
vendored
@@ -45,6 +45,7 @@ var Emacs = Register(chroma.MustNewStyle("emacs", chroma.StyleEntries{
	chroma.GenericPrompt:    "bold #000080",
	chroma.GenericOutput:    "#888",
	chroma.GenericTraceback: "#04D",
+	chroma.GenericUnderline: "underline",
	chroma.Error:            "border:#FF0000",
	chroma.Background:       " bg:#f8f8f8",
}))
1
vendor/github.com/alecthomas/chroma/styles/friendly.go
generated
vendored
@@ -45,6 +45,7 @@ var Friendly = Register(chroma.MustNewStyle("friendly", chroma.StyleEntries{
	chroma.GenericPrompt:    "bold #c65d09",
	chroma.GenericOutput:    "#888",
	chroma.GenericTraceback: "#04D",
+	chroma.GenericUnderline: "underline",
	chroma.Error:            "border:#FF0000",
	chroma.Background:       " bg:#f0f0f0",
}))
1
vendor/github.com/alecthomas/chroma/styles/github.go
generated
vendored
@@ -22,6 +22,7 @@ var GitHub = Register(chroma.MustNewStyle("github", chroma.StyleEntries{
	chroma.GenericStrong:     "bold",
	chroma.GenericSubheading: "#aaaaaa",
	chroma.GenericTraceback:  "#aa0000",
+	chroma.GenericUnderline:  "underline",
	chroma.KeywordType:       "bold #445588",
	chroma.Keyword:           "bold #000000",
	chroma.LiteralNumber:     "#009999",
1
vendor/github.com/alecthomas/chroma/styles/lovelace.go
generated
vendored
1
vendor/github.com/alecthomas/chroma/styles/lovelace.go
generated
vendored
@@ -54,6 +54,7 @@ var Lovelace = Register(chroma.MustNewStyle("lovelace", chroma.StyleEntries{
|
||||
chroma.GenericPrompt: "#444444",
|
||||
chroma.GenericStrong: "bold",
|
||||
chroma.GenericTraceback: "#2838b0",
|
||||
chroma.GenericUnderline: "underline",
|
||||
chroma.Error: "bg:#a848a8",
|
||||
chroma.Background: " bg:#ffffff",
|
||||
}))
|
||||
|
1
vendor/github.com/alecthomas/chroma/styles/manni.go
generated
vendored
1
vendor/github.com/alecthomas/chroma/styles/manni.go
generated
vendored
@@ -45,6 +45,7 @@ var Manni = Register(chroma.MustNewStyle("manni", chroma.StyleEntries{
|
||||
chroma.GenericPrompt: "bold #000099",
|
||||
chroma.GenericOutput: "#AAAAAA",
|
||||
chroma.GenericTraceback: "#99CC66",
|
||||
chroma.GenericUnderline: "underline",
|
||||
chroma.Error: "bg:#FFAAAA #AA0000",
|
||||
chroma.Background: " bg:#f0f3f3",
|
||||
}))
|
||||
|
1
vendor/github.com/alecthomas/chroma/styles/murphy.go
generated
vendored
1
vendor/github.com/alecthomas/chroma/styles/murphy.go
generated
vendored
@@ -53,6 +53,7 @@ var Murphy = Register(chroma.MustNewStyle("murphy", chroma.StyleEntries{
|
||||
chroma.GenericPrompt: "bold #c65d09",
|
||||
chroma.GenericOutput: "#888",
|
||||
chroma.GenericTraceback: "#04D",
|
||||
chroma.GenericUnderline: "underline",
|
||||
chroma.Error: "#F00 bg:#FAA",
|
||||
chroma.Background: " bg:#ffffff",
|
||||
}))
|
||||
|
1
vendor/github.com/alecthomas/chroma/styles/native.go
generated
vendored
1
vendor/github.com/alecthomas/chroma/styles/native.go
generated
vendored
@@ -37,5 +37,6 @@ var Native = Register(chroma.MustNewStyle("native", chroma.StyleEntries{
|
||||
chroma.GenericPrompt: "#aaaaaa",
|
||||
chroma.GenericOutput: "#cccccc",
|
||||
chroma.GenericTraceback: "#d22323",
|
||||
chroma.GenericUnderline: "underline",
|
||||
chroma.Error: "bg:#e3d2d2 #a61717",
|
||||
}))
|
||||
|
1
vendor/github.com/alecthomas/chroma/styles/pastie.go
generated
vendored
1
vendor/github.com/alecthomas/chroma/styles/pastie.go
generated
vendored
@@ -46,6 +46,7 @@ var Pastie = Register(chroma.MustNewStyle("pastie", chroma.StyleEntries{
|
||||
chroma.GenericPrompt: "#555555",
|
||||
chroma.GenericOutput: "#888888",
|
||||
chroma.GenericTraceback: "#aa0000",
|
||||
chroma.GenericUnderline: "underline",
|
||||
chroma.Error: "bg:#e3d2d2 #a61717",
|
||||
chroma.Background: " bg:#ffffff",
|
||||
}))
|
||||
|
1
vendor/github.com/alecthomas/chroma/styles/perldoc.go
generated
vendored
1
vendor/github.com/alecthomas/chroma/styles/perldoc.go
generated
vendored
@@ -38,6 +38,7 @@ var Perldoc = Register(chroma.MustNewStyle("perldoc", chroma.StyleEntries{
|
||||
chroma.GenericPrompt: "#555555",
|
||||
chroma.GenericOutput: "#888888",
|
||||
chroma.GenericTraceback: "#aa0000",
|
||||
chroma.GenericUnderline: "underline",
|
||||
chroma.Error: "bg:#e3d2d2 #a61717",
|
||||
chroma.Background: " bg:#eeeedd",
|
||||
}))
|
||||
|
1
vendor/github.com/alecthomas/chroma/styles/pygments.go
generated
vendored
1
vendor/github.com/alecthomas/chroma/styles/pygments.go
generated
vendored
@@ -49,6 +49,7 @@ var Pygments = Register(chroma.MustNewStyle("pygments", map[chroma.TokenType]str
|
||||
chroma.GenericPrompt: "bold #000080",
|
||||
chroma.GenericOutput: "#888",
|
||||
chroma.GenericTraceback: "#04D",
|
||||
chroma.GenericUnderline: "underline",
|
||||
|
||||
chroma.Error: "border:#FF0000",
|
||||
}))
|
||||
|
1
vendor/github.com/alecthomas/chroma/styles/rainbow_dash.go
generated
vendored
1
vendor/github.com/alecthomas/chroma/styles/rainbow_dash.go
generated
vendored
@@ -20,6 +20,7 @@ var RainbowDash = Register(chroma.MustNewStyle("rainbow_dash", chroma.StyleEntri
|
||||
chroma.GenericStrong: "bold",
|
||||
chroma.GenericSubheading: "bold #2c5dcd",
|
||||
chroma.GenericTraceback: "#c5060b",
|
||||
chroma.GenericUnderline: "underline",
|
||||
chroma.Keyword: "bold #2c5dcd",
|
||||
chroma.KeywordPseudo: "nobold",
|
||||
chroma.KeywordType: "#5918bb",
|
||||
|
46  vendor/github.com/alecthomas/chroma/styles/solarized-dark.go  generated  vendored  Normal file

@@ -0,0 +1,46 @@
+package styles
+
+import (
+	"github.com/alecthomas/chroma"
+)
+
+// SolarizedDark style.
+var SolarizedDark = Register(chroma.MustNewStyle("solarized-dark", chroma.StyleEntries{
+	chroma.Keyword:               "#719e07",
+	chroma.KeywordConstant:       "#CB4B16",
+	chroma.KeywordDeclaration:    "#268BD2",
+	chroma.KeywordReserved:       "#268BD2",
+	chroma.KeywordType:           "#DC322F",
+	chroma.NameAttribute:         "#93A1A1",
+	chroma.NameBuiltin:           "#B58900",
+	chroma.NameBuiltinPseudo:     "#268BD2",
+	chroma.NameClass:             "#268BD2",
+	chroma.NameConstant:          "#CB4B16",
+	chroma.NameDecorator:         "#268BD2",
+	chroma.NameEntity:            "#CB4B16",
+	chroma.NameException:         "#CB4B16",
+	chroma.NameFunction:          "#268BD2",
+	chroma.NameTag:               "#268BD2",
+	chroma.NameVariable:          "#268BD2",
+	chroma.LiteralString:         "#2AA198",
+	chroma.LiteralStringBacktick: "#586E75",
+	chroma.LiteralStringChar:     "#2AA198",
+	chroma.LiteralStringDoc:      "#93A1A1",
+	chroma.LiteralStringEscape:   "#CB4B16",
+	chroma.LiteralStringHeredoc:  "#93A1A1",
+	chroma.LiteralStringRegex:    "#DC322F",
+	chroma.LiteralNumber:         "#2AA198",
+	chroma.Operator:              "#719e07",
+	chroma.Comment:               "#586E75",
+	chroma.CommentPreproc:        "#719e07",
+	chroma.CommentSpecial:        "#719e07",
+	chroma.GenericDeleted:        "#DC322F",
+	chroma.GenericEmph:           "italic",
+	chroma.GenericError:          "#DC322F bold",
+	chroma.GenericHeading:        "#CB4B16",
+	chroma.GenericInserted:       "#719e07",
+	chroma.GenericStrong:         "bold",
+	chroma.GenericSubheading:     "#268BD2",
+	chroma.Background:            "#93A1A1 bg:#002B36",
+	chroma.Other:                 "#CB4B16",
+}))
48  vendor/github.com/alecthomas/chroma/styles/solarized-dark256.go  generated  vendored  Normal file

@@ -0,0 +1,48 @@
+package styles
+
+import (
+	"github.com/alecthomas/chroma"
+)
+
+// SolarizedDark256 style.
+var SolarizedDark256 = Register(chroma.MustNewStyle("solarized-dark256", chroma.StyleEntries{
+	chroma.Keyword:               "#5f8700",
+	chroma.KeywordConstant:       "#d75f00",
+	chroma.KeywordDeclaration:    "#0087ff",
+	chroma.KeywordNamespace:      "#d75f00",
+	chroma.KeywordReserved:       "#0087ff",
+	chroma.KeywordType:           "#af0000",
+	chroma.NameAttribute:         "#8a8a8a",
+	chroma.NameBuiltin:           "#0087ff",
+	chroma.NameBuiltinPseudo:     "#0087ff",
+	chroma.NameClass:             "#0087ff",
+	chroma.NameConstant:          "#d75f00",
+	chroma.NameDecorator:         "#0087ff",
+	chroma.NameEntity:            "#d75f00",
+	chroma.NameException:         "#af8700",
+	chroma.NameFunction:          "#0087ff",
+	chroma.NameTag:               "#0087ff",
+	chroma.NameVariable:          "#0087ff",
+	chroma.LiteralString:         "#00afaf",
+	chroma.LiteralStringBacktick: "#4e4e4e",
+	chroma.LiteralStringChar:     "#00afaf",
+	chroma.LiteralStringDoc:      "#00afaf",
+	chroma.LiteralStringEscape:   "#af0000",
+	chroma.LiteralStringHeredoc:  "#00afaf",
+	chroma.LiteralStringRegex:    "#af0000",
+	chroma.LiteralNumber:         "#00afaf",
+	chroma.Operator:              "#8a8a8a",
+	chroma.OperatorWord:          "#5f8700",
+	chroma.Comment:               "#4e4e4e",
+	chroma.CommentPreproc:        "#5f8700",
+	chroma.CommentSpecial:        "#5f8700",
+	chroma.GenericDeleted:        "#af0000",
+	chroma.GenericEmph:           "italic",
+	chroma.GenericError:          "#af0000 bold",
+	chroma.GenericHeading:        "#d75f00",
+	chroma.GenericInserted:       "#5f8700",
+	chroma.GenericStrong:         "bold",
+	chroma.GenericSubheading:     "#0087ff",
+	chroma.Background:            "#8a8a8a bg:#1c1c1c",
+	chroma.Other:                 "#d75f00",
+}))
24  vendor/github.com/alecthomas/chroma/styles/solarized-light.go  generated  vendored  Normal file

@@ -0,0 +1,24 @@
+package styles
+
+import (
+	"github.com/alecthomas/chroma"
+)
+
+// SolarizedLight style.
+var SolarizedLight = Register(chroma.MustNewStyle("solarized-light", chroma.StyleEntries{
+	chroma.Text:             "bg: #eee8d5 #586e75",
+	chroma.Keyword:          "#859900",
+	chroma.KeywordConstant:  "bold",
+	chroma.KeywordNamespace: "#dc322f bold",
+	chroma.KeywordType:      "bold",
+	chroma.Name:             "#268bd2",
+	chroma.NameBuiltin:      "#cb4b16",
+	chroma.NameClass:        "#cb4b16",
+	chroma.NameTag:          "bold",
+	chroma.Literal:          "#2aa198",
+	chroma.LiteralNumber:    "bold",
+	chroma.OperatorWord:     "#859900",
+	chroma.Comment:          "#93a1a1 italic",
+	chroma.Generic:          "#d33682",
+	chroma.Background:       " bg:#eee8d5",
+}))
1  vendor/github.com/alecthomas/chroma/styles/tango.go  generated  vendored

@@ -74,5 +74,6 @@ var Tango = Register(chroma.MustNewStyle("tango", chroma.StyleEntries{
 	chroma.GenericStrong:     "bold #000000",
 	chroma.GenericSubheading: "bold #800080",
 	chroma.GenericTraceback:  "bold #a40000",
+	chroma.GenericUnderline:  "underline",
 	chroma.Background:        " bg:#f8f8f8",
 }))

1  vendor/github.com/alecthomas/chroma/styles/trac.go  generated  vendored

@@ -36,6 +36,7 @@ var Trac = Register(chroma.MustNewStyle("trac", chroma.StyleEntries{
 	chroma.GenericPrompt:    "#555555",
 	chroma.GenericOutput:    "#888888",
 	chroma.GenericTraceback: "#aa0000",
+	chroma.GenericUnderline: "underline",
 	chroma.Error:            "bg:#e3d2d2 #a61717",
 	chroma.Background:       " bg:#ffffff",
 }))

1  vendor/github.com/alecthomas/chroma/styles/vim.go  generated  vendored

@@ -31,5 +31,6 @@ var Vim = Register(chroma.MustNewStyle("vim", chroma.StyleEntries{
 	chroma.GenericPrompt:    "bold #000080",
 	chroma.GenericOutput:    "#888",
 	chroma.GenericTraceback: "#04D",
+	chroma.GenericUnderline: "underline",
 	chroma.Error:            "border:#FF0000",
 }))
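Every one of the style hunks above is a `chroma.StyleEntries` map from a token type to a style-description string such as `"bold #c65d09"` or `"underline"`. As an illustrative, dependency-free sketch of how such a map can be consulted (the fallback-to-category logic here is hypothetical, not chroma's actual style-resolution algorithm):

```go
package main

import "fmt"

type TokenType int

const (
	Generic          TokenType = 6000
	GenericUnderline TokenType = 6011
)

// Category truncates a token type to its broader category:
// 6011 (GenericUnderline) belongs to category 6000 (Generic).
func (t TokenType) Category() TokenType { return t / 1000 * 1000 }

// entries mirrors the shape of the StyleEntries maps in the hunks above.
var entries = map[TokenType]string{
	Generic:          "#888",
	GenericUnderline: "underline",
}

// styleFor returns the entry for the exact token type, falling back to
// its category when no exact entry exists (a simplified sketch).
func styleFor(t TokenType) string {
	if s, ok := entries[t]; ok {
		return s
	}
	return entries[t.Category()]
}

func main() {
	fmt.Println(styleFor(GenericUnderline)) // underline
	fmt.Println(styleFor(TokenType(6004)))  // #888
}
```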
177  vendor/github.com/alecthomas/chroma/tokentype_string.go  generated  vendored

@@ -2,9 +2,9 @@

 package chroma

-import "fmt"
+import "strconv"

-const _TokenType_name = "NoneOtherErrorLineTableTDLineTableLineHighlightLineNumbersTableLineNumbersBackgroundKeywordKeywordConstantKeywordDeclarationKeywordNamespaceKeywordPseudoKeywordReservedKeywordTypeNameNameAttributeNameBuiltinNameBuiltinPseudoNameClassNameConstantNameDecoratorNameEntityNameExceptionNameFunctionNameFunctionMagicNameKeywordNameLabelNameNamespaceNameOperatorNameOtherNamePseudoNamePropertyNameTagNameVariableNameVariableAnonymousNameVariableClassNameVariableGlobalNameVariableInstanceNameVariableMagicLiteralLiteralDateLiteralOtherLiteralStringLiteralStringAffixLiteralStringAtomLiteralStringBacktickLiteralStringBooleanLiteralStringCharLiteralStringDelimiterLiteralStringDocLiteralStringDoubleLiteralStringEscapeLiteralStringHeredocLiteralStringInterpolLiteralStringNameLiteralStringOtherLiteralStringRegexLiteralStringSingleLiteralStringSymbolLiteralNumberLiteralNumberBinLiteralNumberFloatLiteralNumberHexLiteralNumberIntegerLiteralNumberIntegerLongLiteralNumberOctOperatorOperatorWordPunctuationCommentCommentHashbangCommentMultilineCommentSingleCommentSpecialCommentPreprocCommentPreprocFileGenericGenericDeletedGenericEmphGenericErrorGenericHeadingGenericInsertedGenericOutputGenericPromptGenericStrongGenericSubheadingGenericTracebackGenericUnderlineTextTextWhitespaceTextSymbolTextPunctuation"
+const _TokenType_name = "NoneOtherErrorLineTableTDLineTableLineHighlightLineNumbersTableLineNumbersBackgroundEOFTypeKeywordKeywordConstantKeywordDeclarationKeywordNamespaceKeywordPseudoKeywordReservedKeywordTypeNameNameAttributeNameBuiltinNameBuiltinPseudoNameClassNameConstantNameDecoratorNameEntityNameExceptionNameFunctionNameFunctionMagicNameKeywordNameLabelNameNamespaceNameOperatorNameOtherNamePseudoNamePropertyNameTagNameVariableNameVariableAnonymousNameVariableClassNameVariableGlobalNameVariableInstanceNameVariableMagicLiteralLiteralDateLiteralOtherLiteralStringLiteralStringAffixLiteralStringAtomLiteralStringBacktickLiteralStringBooleanLiteralStringCharLiteralStringDelimiterLiteralStringDocLiteralStringDoubleLiteralStringEscapeLiteralStringHeredocLiteralStringInterpolLiteralStringNameLiteralStringOtherLiteralStringRegexLiteralStringSingleLiteralStringSymbolLiteralNumberLiteralNumberBinLiteralNumberFloatLiteralNumberHexLiteralNumberIntegerLiteralNumberIntegerLongLiteralNumberOctOperatorOperatorWordPunctuationCommentCommentHashbangCommentMultilineCommentSingleCommentSpecialCommentPreprocCommentPreprocFileGenericGenericDeletedGenericEmphGenericErrorGenericHeadingGenericInsertedGenericOutputGenericPromptGenericStrongGenericSubheadingGenericTracebackGenericUnderlineTextTextWhitespaceTextSymbolTextPunctuation"

 var _TokenType_map = map[TokenType]string{
 	-9: _TokenType_name[0:4],
@@ -16,96 +16,97 @@ var _TokenType_map = map[TokenType]string{
 	-3: _TokenType_name[47:63],
 	-2: _TokenType_name[63:74],
 	-1: _TokenType_name[74:84],
-	1000: _TokenType_name[84:91],
-	1001: _TokenType_name[91:106],
-	1002: _TokenType_name[106:124],
-	1003: _TokenType_name[124:140],
-	1004: _TokenType_name[140:153],
-	1005: _TokenType_name[153:168],
-	1006: _TokenType_name[168:179],
-	2000: _TokenType_name[179:183],
-	2001: _TokenType_name[183:196],
-	2002: _TokenType_name[196:207],
-	2003: _TokenType_name[207:224],
-	2004: _TokenType_name[224:233],
-	2005: _TokenType_name[233:245],
-	2006: _TokenType_name[245:258],
-	2007: _TokenType_name[258:268],
-	2008: _TokenType_name[268:281],
-	2009: _TokenType_name[281:293],
-	2010: _TokenType_name[293:310],
-	2011: _TokenType_name[310:321],
-	2012: _TokenType_name[321:330],
-	2013: _TokenType_name[330:343],
-	2014: _TokenType_name[343:355],
-	2015: _TokenType_name[355:364],
-	2016: _TokenType_name[364:374],
-	2017: _TokenType_name[374:386],
-	2018: _TokenType_name[386:393],
-	2019: _TokenType_name[393:405],
-	2020: _TokenType_name[405:426],
-	2021: _TokenType_name[426:443],
-	2022: _TokenType_name[443:461],
-	2023: _TokenType_name[461:481],
-	2024: _TokenType_name[481:498],
-	3000: _TokenType_name[498:505],
-	3001: _TokenType_name[505:516],
-	3002: _TokenType_name[516:528],
-	3100: _TokenType_name[528:541],
-	3101: _TokenType_name[541:559],
-	3102: _TokenType_name[559:576],
-	3103: _TokenType_name[576:597],
-	3104: _TokenType_name[597:617],
-	3105: _TokenType_name[617:634],
-	3106: _TokenType_name[634:656],
-	3107: _TokenType_name[656:672],
-	3108: _TokenType_name[672:691],
-	3109: _TokenType_name[691:710],
-	3110: _TokenType_name[710:730],
-	3111: _TokenType_name[730:751],
-	3112: _TokenType_name[751:768],
-	3113: _TokenType_name[768:786],
-	3114: _TokenType_name[786:804],
-	3115: _TokenType_name[804:823],
-	3116: _TokenType_name[823:842],
-	3200: _TokenType_name[842:855],
-	3201: _TokenType_name[855:871],
-	3202: _TokenType_name[871:889],
-	3203: _TokenType_name[889:905],
-	3204: _TokenType_name[905:925],
-	3205: _TokenType_name[925:949],
-	3206: _TokenType_name[949:965],
-	4000: _TokenType_name[965:973],
-	4001: _TokenType_name[973:985],
-	5000: _TokenType_name[985:996],
-	6000: _TokenType_name[996:1003],
-	6001: _TokenType_name[1003:1018],
-	6002: _TokenType_name[1018:1034],
-	6003: _TokenType_name[1034:1047],
-	6004: _TokenType_name[1047:1061],
-	6100: _TokenType_name[1061:1075],
-	6101: _TokenType_name[1075:1093],
-	7000: _TokenType_name[1093:1100],
-	7001: _TokenType_name[1100:1114],
-	7002: _TokenType_name[1114:1125],
-	7003: _TokenType_name[1125:1137],
-	7004: _TokenType_name[1137:1151],
-	7005: _TokenType_name[1151:1166],
-	7006: _TokenType_name[1166:1179],
-	7007: _TokenType_name[1179:1192],
-	7008: _TokenType_name[1192:1205],
-	7009: _TokenType_name[1205:1222],
-	7010: _TokenType_name[1222:1238],
-	7011: _TokenType_name[1238:1254],
-	8000: _TokenType_name[1254:1258],
-	8001: _TokenType_name[1258:1272],
-	8002: _TokenType_name[1272:1282],
-	8003: _TokenType_name[1282:1297],
+	0:    _TokenType_name[84:91],
+	1000: _TokenType_name[91:98],
+	1001: _TokenType_name[98:113],
+	1002: _TokenType_name[113:131],
+	1003: _TokenType_name[131:147],
+	1004: _TokenType_name[147:160],
+	1005: _TokenType_name[160:175],
+	1006: _TokenType_name[175:186],
+	2000: _TokenType_name[186:190],
+	2001: _TokenType_name[190:203],
+	2002: _TokenType_name[203:214],
+	2003: _TokenType_name[214:231],
+	2004: _TokenType_name[231:240],
+	2005: _TokenType_name[240:252],
+	2006: _TokenType_name[252:265],
+	2007: _TokenType_name[265:275],
+	2008: _TokenType_name[275:288],
+	2009: _TokenType_name[288:300],
+	2010: _TokenType_name[300:317],
+	2011: _TokenType_name[317:328],
+	2012: _TokenType_name[328:337],
+	2013: _TokenType_name[337:350],
+	2014: _TokenType_name[350:362],
+	2015: _TokenType_name[362:371],
+	2016: _TokenType_name[371:381],
+	2017: _TokenType_name[381:393],
+	2018: _TokenType_name[393:400],
+	2019: _TokenType_name[400:412],
+	2020: _TokenType_name[412:433],
+	2021: _TokenType_name[433:450],
+	2022: _TokenType_name[450:468],
+	2023: _TokenType_name[468:488],
+	2024: _TokenType_name[488:505],
+	3000: _TokenType_name[505:512],
+	3001: _TokenType_name[512:523],
+	3002: _TokenType_name[523:535],
+	3100: _TokenType_name[535:548],
+	3101: _TokenType_name[548:566],
+	3102: _TokenType_name[566:583],
+	3103: _TokenType_name[583:604],
+	3104: _TokenType_name[604:624],
+	3105: _TokenType_name[624:641],
+	3106: _TokenType_name[641:663],
+	3107: _TokenType_name[663:679],
+	3108: _TokenType_name[679:698],
+	3109: _TokenType_name[698:717],
+	3110: _TokenType_name[717:737],
+	3111: _TokenType_name[737:758],
+	3112: _TokenType_name[758:775],
+	3113: _TokenType_name[775:793],
+	3114: _TokenType_name[793:811],
+	3115: _TokenType_name[811:830],
+	3116: _TokenType_name[830:849],
+	3200: _TokenType_name[849:862],
+	3201: _TokenType_name[862:878],
+	3202: _TokenType_name[878:896],
+	3203: _TokenType_name[896:912],
+	3204: _TokenType_name[912:932],
+	3205: _TokenType_name[932:956],
+	3206: _TokenType_name[956:972],
+	4000: _TokenType_name[972:980],
+	4001: _TokenType_name[980:992],
+	5000: _TokenType_name[992:1003],
+	6000: _TokenType_name[1003:1010],
+	6001: _TokenType_name[1010:1025],
+	6002: _TokenType_name[1025:1041],
+	6003: _TokenType_name[1041:1054],
+	6004: _TokenType_name[1054:1068],
+	6100: _TokenType_name[1068:1082],
+	6101: _TokenType_name[1082:1100],
+	7000: _TokenType_name[1100:1107],
+	7001: _TokenType_name[1107:1121],
+	7002: _TokenType_name[1121:1132],
+	7003: _TokenType_name[1132:1144],
+	7004: _TokenType_name[1144:1158],
+	7005: _TokenType_name[1158:1173],
+	7006: _TokenType_name[1173:1186],
+	7007: _TokenType_name[1186:1199],
+	7008: _TokenType_name[1199:1212],
+	7009: _TokenType_name[1212:1229],
+	7010: _TokenType_name[1229:1245],
+	7011: _TokenType_name[1245:1261],
+	8000: _TokenType_name[1261:1265],
+	8001: _TokenType_name[1265:1279],
+	8002: _TokenType_name[1279:1289],
+	8003: _TokenType_name[1289:1304],
 }

 func (i TokenType) String() string {
 	if str, ok := _TokenType_map[i]; ok {
 		return str
 	}
-	return fmt.Sprintf("TokenType(%d)", i)
+	return "TokenType(" + strconv.FormatInt(int64(i), 10) + ")"
 }
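The generated file above uses the standard `stringer` layout: all token names are concatenated into one string constant, and the map stores `[start:end]` slices into it, so no per-name string allocations are needed. A miniature sketch of the same technique (with a made-up three-entry name table, not chroma's real one), including the `strconv` fallback this hunk switches to in place of `fmt.Sprintf`:

```go
package main

import (
	"fmt"
	"strconv"
)

type TokenType int

// One concatenated string holds every name; the map slices into it.
const _name = "KeywordNameLiteral"

var _map = map[TokenType]string{
	1000: _name[0:7],   // "Keyword"
	2000: _name[7:11],  // "Name"
	3000: _name[11:18], // "Literal"
}

func (t TokenType) String() string {
	if s, ok := _map[t]; ok {
		return s
	}
	// strconv avoids depending on the heavier fmt package in the
	// generated code, which is the change this hunk makes.
	return "TokenType(" + strconv.FormatInt(int64(t), 10) + ")"
}

func main() {
	fmt.Println(TokenType(1000)) // Keyword
	fmt.Println(TokenType(42))   // TokenType(42)
}
```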
7  vendor/github.com/alecthomas/chroma/types.go  generated  vendored

@@ -12,7 +12,7 @@ import (
 // It is also an Emitter, emitting a single token of itself
 type TokenType int

-func (t *TokenType) MarshalJSON() ([]byte, error) { return json.Marshal(t.String()) }
+func (t TokenType) MarshalJSON() ([]byte, error)  { return json.Marshal(t.String()) }
 func (t *TokenType) UnmarshalJSON(data []byte) error {
 	key := ""
 	err := json.Unmarshal(data, &key)
@@ -54,6 +54,8 @@ const (
 	Other
 	// No highlighting.
 	None
+	// Used as an EOF marker / nil token
+	EOFType TokenType = 0
 )

 // Keywords.
@@ -310,6 +312,7 @@ var (
 	GenericStrong:     "gs",
 	GenericSubheading: "gu",
 	GenericTraceback:  "gt",
+	GenericUnderline:  "gl",
 	}
 )

@@ -340,5 +343,5 @@ func (t TokenType) InSubCategory(other TokenType) bool {
 }

 func (t TokenType) Emit(groups []string, lexer Lexer) Iterator {
-	return Literator(&Token{Type: t, Value: groups[0]})
+	return Literator(Token{Type: t, Value: groups[0]})
 }
1  vendor/github.com/rs/zerolog/CNAME  generated  vendored  Normal file

@@ -0,0 +1 @@
+zerolog.io

1  vendor/github.com/rs/zerolog/_config.yml  generated  vendored  Normal file

@@ -0,0 +1 @@
+remote_theme: rs/gh-readme
56  vendor/github.com/rs/zerolog/encoder.go  generated  vendored  Normal file

@@ -0,0 +1,56 @@
+package zerolog
+
+import (
+	"net"
+	"time"
+)
+
+type encoder interface {
+	AppendArrayDelim(dst []byte) []byte
+	AppendArrayEnd(dst []byte) []byte
+	AppendArrayStart(dst []byte) []byte
+	AppendBeginMarker(dst []byte) []byte
+	AppendBool(dst []byte, val bool) []byte
+	AppendBools(dst []byte, vals []bool) []byte
+	AppendBytes(dst, s []byte) []byte
+	AppendDuration(dst []byte, d time.Duration, unit time.Duration, useInt bool) []byte
+	AppendDurations(dst []byte, vals []time.Duration, unit time.Duration, useInt bool) []byte
+	AppendEndMarker(dst []byte) []byte
+	AppendFloat32(dst []byte, val float32) []byte
+	AppendFloat64(dst []byte, val float64) []byte
+	AppendFloats32(dst []byte, vals []float32) []byte
+	AppendFloats64(dst []byte, vals []float64) []byte
+	AppendHex(dst, s []byte) []byte
+	AppendIPAddr(dst []byte, ip net.IP) []byte
+	AppendIPPrefix(dst []byte, pfx net.IPNet) []byte
+	AppendInt(dst []byte, val int) []byte
+	AppendInt16(dst []byte, val int16) []byte
+	AppendInt32(dst []byte, val int32) []byte
+	AppendInt64(dst []byte, val int64) []byte
+	AppendInt8(dst []byte, val int8) []byte
+	AppendInterface(dst []byte, i interface{}) []byte
+	AppendInts(dst []byte, vals []int) []byte
+	AppendInts16(dst []byte, vals []int16) []byte
+	AppendInts32(dst []byte, vals []int32) []byte
+	AppendInts64(dst []byte, vals []int64) []byte
+	AppendInts8(dst []byte, vals []int8) []byte
+	AppendKey(dst []byte, key string) []byte
+	AppendLineBreak(dst []byte) []byte
+	AppendMACAddr(dst []byte, ha net.HardwareAddr) []byte
+	AppendNil(dst []byte) []byte
+	AppendObjectData(dst []byte, o []byte) []byte
+	AppendString(dst []byte, s string) []byte
+	AppendStrings(dst []byte, vals []string) []byte
+	AppendTime(dst []byte, t time.Time, format string) []byte
+	AppendTimes(dst []byte, vals []time.Time, format string) []byte
+	AppendUint(dst []byte, val uint) []byte
+	AppendUint16(dst []byte, val uint16) []byte
+	AppendUint32(dst []byte, val uint32) []byte
+	AppendUint64(dst []byte, val uint64) []byte
+	AppendUint8(dst []byte, val uint8) []byte
+	AppendUints(dst []byte, vals []uint) []byte
+	AppendUints16(dst []byte, vals []uint16) []byte
+	AppendUints32(dst []byte, vals []uint32) []byte
+	AppendUints64(dst []byte, vals []uint64) []byte
+	AppendUints8(dst []byte, vals []uint8) []byte
+}
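The `encoder` interface above is append-based: every method takes a destination `[]byte` and returns the (possibly reallocated) slice, so a single buffer can be reused across an entire log event without intermediate strings. A minimal dependency-free sketch of that calling convention (the `appendKey`/`appendInt` helpers are illustrative, not zerolog's real implementations):

```go
package main

import (
	"fmt"
	"strconv"
)

// appendKey writes a quoted JSON key followed by ':' into dst and
// returns the grown slice, mirroring the encoder's Append* shape.
func appendKey(dst []byte, key string) []byte {
	dst = append(dst, '"')
	dst = append(dst, key...)
	return append(dst, '"', ':')
}

// appendInt delegates to strconv.AppendInt, which also follows the
// append-to-dst convention and avoids allocating a string.
func appendInt(dst []byte, val int) []byte {
	return strconv.AppendInt(dst, int64(val), 10)
}

func main() {
	// One buffer threaded through every call, as zerolog does per event.
	buf := []byte{'{'}
	buf = appendKey(buf, "port")
	buf = appendInt(buf, 8080)
	buf = append(buf, '}')
	fmt.Println(string(buf)) // {"port":8080}
}
```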
1  vendor/github.com/rs/zerolog/go.mod  generated  vendored  Normal file

@@ -0,0 +1 @@
+module github.com/rs/zerolog

7  vendor/github.com/rs/zerolog/go112.go  generated  vendored  Normal file

@@ -0,0 +1,7 @@
+// +build go1.12
+
+package zerolog
+
+// Since go 1.12, some auto generated init functions are hidden from
+// runtime.Caller.
+const contextCallerSkipFrameCount = 2
614  vendor/github.com/rs/zerolog/internal/cbor/decode_stream.go  generated  vendored  Normal file

@@ -0,0 +1,614 @@
+package cbor
+
+// This file contains code to decode a stream of CBOR Data into JSON.
+
+import (
+	"bufio"
+	"bytes"
+	"fmt"
+	"io"
+	"math"
+	"net"
+	"runtime"
+	"strconv"
+	"strings"
+	"time"
+	"unicode/utf8"
+)
+
+var decodeTimeZone *time.Location
+
+const hexTable = "0123456789abcdef"
+
+const isFloat32 = 4
+const isFloat64 = 8
+
+func readNBytes(src *bufio.Reader, n int) []byte {
+	ret := make([]byte, n)
+	for i := 0; i < n; i++ {
+		ch, e := src.ReadByte()
+		if e != nil {
+			panic(fmt.Errorf("Tried to Read %d Bytes.. But hit end of file", n))
+		}
+		ret[i] = ch
+	}
+	return ret
+}
+
+func readByte(src *bufio.Reader) byte {
+	b, e := src.ReadByte()
+	if e != nil {
+		panic(fmt.Errorf("Tried to Read 1 Byte.. But hit end of file"))
+	}
+	return b
+}
+
+func decodeIntAdditonalType(src *bufio.Reader, minor byte) int64 {
+	val := int64(0)
+	if minor <= 23 {
+		val = int64(minor)
+	} else {
+		bytesToRead := 0
+		switch minor {
+		case additionalTypeIntUint8:
+			bytesToRead = 1
+		case additionalTypeIntUint16:
+			bytesToRead = 2
+		case additionalTypeIntUint32:
+			bytesToRead = 4
+		case additionalTypeIntUint64:
+			bytesToRead = 8
+		default:
+			panic(fmt.Errorf("Invalid Additional Type: %d in decodeInteger (expected <28)", minor))
+		}
+		pb := readNBytes(src, bytesToRead)
+		for i := 0; i < bytesToRead; i++ {
+			val = val * 256
+			val += int64(pb[i])
+		}
+	}
+	return val
+}
+
+func decodeInteger(src *bufio.Reader) int64 {
+	pb := readByte(src)
+	major := pb & maskOutAdditionalType
+	minor := pb & maskOutMajorType
+	if major != majorTypeUnsignedInt && major != majorTypeNegativeInt {
+		panic(fmt.Errorf("Major type is: %d in decodeInteger!! (expected 0 or 1)", major))
+	}
+	val := decodeIntAdditonalType(src, minor)
+	if major == 0 {
+		return val
+	}
+	return (-1 - val)
+}
+
+func decodeFloat(src *bufio.Reader) (float64, int) {
+	pb := readByte(src)
+	major := pb & maskOutAdditionalType
+	minor := pb & maskOutMajorType
+	if major != majorTypeSimpleAndFloat {
+		panic(fmt.Errorf("Incorrect Major type is: %d in decodeFloat", major))
+	}
+
+	switch minor {
+	case additionalTypeFloat16:
+		panic(fmt.Errorf("float16 is not suppported in decodeFloat"))
+
+	case additionalTypeFloat32:
+		pb := readNBytes(src, 4)
+		switch string(pb) {
+		case float32Nan:
+			return math.NaN(), isFloat32
+		case float32PosInfinity:
+			return math.Inf(0), isFloat32
+		case float32NegInfinity:
+			return math.Inf(-1), isFloat32
+		}
+		n := uint32(0)
+		for i := 0; i < 4; i++ {
+			n = n * 256
+			n += uint32(pb[i])
+		}
+		val := math.Float32frombits(n)
+		return float64(val), isFloat32
+	case additionalTypeFloat64:
+		pb := readNBytes(src, 8)
+		switch string(pb) {
+		case float64Nan:
+			return math.NaN(), isFloat64
+		case float64PosInfinity:
+			return math.Inf(0), isFloat64
+		case float64NegInfinity:
+			return math.Inf(-1), isFloat64
+		}
+		n := uint64(0)
+		for i := 0; i < 8; i++ {
+			n = n * 256
+			n += uint64(pb[i])
+		}
+		val := math.Float64frombits(n)
+		return val, isFloat64
+	}
+	panic(fmt.Errorf("Invalid Additional Type: %d in decodeFloat", minor))
+}
+
+func decodeStringComplex(dst []byte, s string, pos uint) []byte {
+	i := int(pos)
+	start := 0
+
+	for i < len(s) {
+		b := s[i]
+		if b >= utf8.RuneSelf {
+			r, size := utf8.DecodeRuneInString(s[i:])
+			if r == utf8.RuneError && size == 1 {
+				// In case of error, first append previous simple characters to
+				// the byte slice if any and append a replacement character code
+				// in place of the invalid sequence.
+				if start < i {
+					dst = append(dst, s[start:i]...)
+				}
+				dst = append(dst, `\ufffd`...)
+				i += size
+				start = i
+				continue
+			}
+			i += size
+			continue
+		}
+		if b >= 0x20 && b <= 0x7e && b != '\\' && b != '"' {
+			i++
+			continue
+		}
+		// We encountered a character that needs to be encoded.
+		// Let's append the previous simple characters to the byte slice
+		// and switch our operation to read and encode the remainder
+		// characters byte-by-byte.
+		if start < i {
+			dst = append(dst, s[start:i]...)
+		}
+		switch b {
+		case '"', '\\':
+			dst = append(dst, '\\', b)
+		case '\b':
+			dst = append(dst, '\\', 'b')
+		case '\f':
+			dst = append(dst, '\\', 'f')
+		case '\n':
+			dst = append(dst, '\\', 'n')
+		case '\r':
+			dst = append(dst, '\\', 'r')
+		case '\t':
+			dst = append(dst, '\\', 't')
+		default:
+			dst = append(dst, '\\', 'u', '0', '0', hexTable[b>>4], hexTable[b&0xF])
+		}
+		i++
+		start = i
+	}
+	if start < len(s) {
+		dst = append(dst, s[start:]...)
+	}
+	return dst
+}
+
+func decodeString(src *bufio.Reader, noQuotes bool) []byte {
+	pb := readByte(src)
+	major := pb & maskOutAdditionalType
+	minor := pb & maskOutMajorType
+	if major != majorTypeByteString {
+		panic(fmt.Errorf("Major type is: %d in decodeString", major))
+	}
+	result := []byte{}
+	if !noQuotes {
+		result = append(result, '"')
+	}
+	length := decodeIntAdditonalType(src, minor)
+	len := int(length)
+	pbs := readNBytes(src, len)
+	result = append(result, pbs...)
+	if noQuotes {
+		return result
+	}
+	return append(result, '"')
+}
+
+func decodeUTF8String(src *bufio.Reader) []byte {
+	pb := readByte(src)
+	major := pb & maskOutAdditionalType
+	minor := pb & maskOutMajorType
+	if major != majorTypeUtf8String {
+		panic(fmt.Errorf("Major type is: %d in decodeUTF8String", major))
+	}
+	result := []byte{'"'}
+	length := decodeIntAdditonalType(src, minor)
+	len := int(length)
+	pbs := readNBytes(src, len)
+
+	for i := 0; i < len; i++ {
+		// Check if the character needs encoding. Control characters, slashes,
+		// and the double quote need json encoding. Bytes above the ascii
+		// boundary needs utf8 encoding.
+		if pbs[i] < 0x20 || pbs[i] > 0x7e || pbs[i] == '\\' || pbs[i] == '"' {
+			// We encountered a character that needs to be encoded. Switch
+			// to complex version of the algorithm.
+			dst := []byte{'"'}
+			dst = decodeStringComplex(dst, string(pbs), uint(i))
+			return append(dst, '"')
+		}
+	}
+	// The string has no need for encoding an therefore is directly
+	// appended to the byte slice.
+	result = append(result, pbs...)
+	return append(result, '"')
+}
+
+func array2Json(src *bufio.Reader, dst io.Writer) {
+	dst.Write([]byte{'['})
+	pb := readByte(src)
+	major := pb & maskOutAdditionalType
+	minor := pb & maskOutMajorType
+	if major != majorTypeArray {
+		panic(fmt.Errorf("Major type is: %d in array2Json", major))
+	}
+	len := 0
+	unSpecifiedCount := false
+	if minor == additionalTypeInfiniteCount {
+		unSpecifiedCount = true
+	} else {
+		length := decodeIntAdditonalType(src, minor)
len = int(length)
|
||||
}
|
||||
for i := 0; unSpecifiedCount || i < len; i++ {
|
||||
if unSpecifiedCount {
|
||||
pb, e := src.Peek(1)
|
||||
if e != nil {
|
||||
panic(e)
|
||||
}
|
||||
if pb[0] == byte(majorTypeSimpleAndFloat|additionalTypeBreak) {
|
||||
readByte(src)
|
||||
break
|
||||
}
|
||||
}
|
||||
cbor2JsonOneObject(src, dst)
|
||||
if unSpecifiedCount {
|
||||
pb, e := src.Peek(1)
|
||||
if e != nil {
|
||||
panic(e)
|
||||
}
|
||||
if pb[0] == byte(majorTypeSimpleAndFloat|additionalTypeBreak) {
|
||||
readByte(src)
|
||||
break
|
||||
}
|
||||
dst.Write([]byte{','})
|
||||
} else if i+1 < len {
|
||||
dst.Write([]byte{','})
|
||||
}
|
||||
}
|
||||
dst.Write([]byte{']'})
|
||||
}
|
||||
|
||||
func map2Json(src *bufio.Reader, dst io.Writer) {
|
||||
pb := readByte(src)
|
||||
major := pb & maskOutAdditionalType
|
||||
minor := pb & maskOutMajorType
|
||||
if major != majorTypeMap {
|
||||
panic(fmt.Errorf("Major type is: %d in map2Json", major))
|
||||
}
|
||||
len := 0
|
||||
unSpecifiedCount := false
|
||||
if minor == additionalTypeInfiniteCount {
|
||||
unSpecifiedCount = true
|
||||
} else {
|
||||
length := decodeIntAdditonalType(src, minor)
|
||||
len = int(length)
|
||||
}
|
||||
dst.Write([]byte{'{'})
|
||||
for i := 0; unSpecifiedCount || i < len; i++ {
|
||||
if unSpecifiedCount {
|
||||
pb, e := src.Peek(1)
|
||||
if e != nil {
|
||||
panic(e)
|
||||
}
|
||||
if pb[0] == byte(majorTypeSimpleAndFloat|additionalTypeBreak) {
|
||||
readByte(src)
|
||||
break
|
||||
}
|
||||
}
|
||||
cbor2JsonOneObject(src, dst)
|
||||
if i%2 == 0 {
|
||||
// Even position values are keys.
|
||||
dst.Write([]byte{':'})
|
||||
} else {
|
||||
if unSpecifiedCount {
|
||||
pb, e := src.Peek(1)
|
||||
if e != nil {
|
||||
panic(e)
|
||||
}
|
||||
if pb[0] == byte(majorTypeSimpleAndFloat|additionalTypeBreak) {
|
||||
readByte(src)
|
||||
break
|
||||
}
|
||||
dst.Write([]byte{','})
|
||||
} else if i+1 < len {
|
||||
dst.Write([]byte{','})
|
||||
}
|
||||
}
|
||||
}
|
||||
dst.Write([]byte{'}'})
|
||||
}
|
||||
|
||||
func decodeTagData(src *bufio.Reader) []byte {
|
||||
pb := readByte(src)
|
||||
major := pb & maskOutAdditionalType
|
||||
minor := pb & maskOutMajorType
|
||||
if major != majorTypeTags {
|
||||
panic(fmt.Errorf("Major type is: %d in decodeTagData", major))
|
||||
}
|
||||
switch minor {
|
||||
case additionalTypeTimestamp:
|
||||
return decodeTimeStamp(src)
|
||||
|
||||
// Tag value is larger than 256 (so uint16).
|
||||
case additionalTypeIntUint16:
|
||||
val := decodeIntAdditonalType(src, minor)
|
||||
|
||||
switch uint16(val) {
|
||||
case additionalTypeEmbeddedJSON:
|
||||
pb := readByte(src)
|
||||
dataMajor := pb & maskOutAdditionalType
|
||||
if dataMajor != majorTypeByteString {
|
||||
panic(fmt.Errorf("Unsupported embedded Type: %d in decodeEmbeddedJSON", dataMajor))
|
||||
}
|
||||
src.UnreadByte()
|
||||
return decodeString(src, true)
|
||||
|
||||
case additionalTypeTagNetworkAddr:
|
||||
octets := decodeString(src, true)
|
||||
ss := []byte{'"'}
|
||||
switch len(octets) {
|
||||
case 6: // MAC address.
|
||||
ha := net.HardwareAddr(octets)
|
||||
ss = append(append(ss, ha.String()...), '"')
|
||||
case 4: // IPv4 address.
|
||||
fallthrough
|
||||
case 16: // IPv6 address.
|
||||
ip := net.IP(octets)
|
||||
ss = append(append(ss, ip.String()...), '"')
|
||||
default:
|
||||
panic(fmt.Errorf("Unexpected Network Address length: %d (expected 4,6,16)", len(octets)))
|
||||
}
|
||||
return ss
|
||||
|
||||
case additionalTypeTagNetworkPrefix:
|
||||
pb := readByte(src)
|
||||
if pb != byte(majorTypeMap|0x1) {
|
||||
panic(fmt.Errorf("IP Prefix is NOT of MAP of 1 elements as expected"))
|
||||
}
|
||||
octets := decodeString(src, true)
|
||||
val := decodeInteger(src)
|
||||
ip := net.IP(octets)
|
||||
var mask net.IPMask
|
||||
pfxLen := int(val)
|
||||
if len(octets) == 4 {
|
||||
mask = net.CIDRMask(pfxLen, 32)
|
||||
} else {
|
||||
mask = net.CIDRMask(pfxLen, 128)
|
||||
}
|
||||
ipPfx := net.IPNet{IP: ip, Mask: mask}
|
||||
ss := []byte{'"'}
|
||||
ss = append(append(ss, ipPfx.String()...), '"')
|
||||
return ss
|
||||
|
||||
case additionalTypeTagHexString:
|
||||
octets := decodeString(src, true)
|
||||
ss := []byte{'"'}
|
||||
for _, v := range octets {
|
||||
ss = append(ss, hexTable[v>>4], hexTable[v&0x0f])
|
||||
}
|
||||
return append(ss, '"')
|
||||
|
||||
default:
|
||||
panic(fmt.Errorf("Unsupported Additional Tag Type: %d in decodeTagData", val))
|
||||
}
|
||||
}
|
||||
panic(fmt.Errorf("Unsupported Additional Type: %d in decodeTagData", minor))
|
||||
}
|
||||
|
||||
func decodeTimeStamp(src *bufio.Reader) []byte {
|
||||
pb := readByte(src)
|
||||
src.UnreadByte()
|
||||
tsMajor := pb & maskOutAdditionalType
|
||||
if tsMajor == majorTypeUnsignedInt || tsMajor == majorTypeNegativeInt {
|
||||
n := decodeInteger(src)
|
||||
t := time.Unix(n, 0)
|
||||
if decodeTimeZone != nil {
|
||||
t = t.In(decodeTimeZone)
|
||||
} else {
|
||||
t = t.In(time.UTC)
|
||||
}
|
||||
tsb := []byte{}
|
||||
tsb = append(tsb, '"')
|
||||
tsb = t.AppendFormat(tsb, IntegerTimeFieldFormat)
|
||||
tsb = append(tsb, '"')
|
||||
return tsb
|
||||
} else if tsMajor == majorTypeSimpleAndFloat {
|
||||
n, _ := decodeFloat(src)
|
||||
secs := int64(n)
|
||||
n -= float64(secs)
|
||||
n *= float64(1e9)
|
||||
t := time.Unix(secs, int64(n))
|
||||
if decodeTimeZone != nil {
|
||||
t = t.In(decodeTimeZone)
|
||||
} else {
|
||||
t = t.In(time.UTC)
|
||||
}
|
||||
tsb := []byte{}
|
||||
tsb = append(tsb, '"')
|
||||
tsb = t.AppendFormat(tsb, NanoTimeFieldFormat)
|
||||
tsb = append(tsb, '"')
|
||||
return tsb
|
||||
}
|
||||
panic(fmt.Errorf("TS format is neigther int nor float: %d", tsMajor))
|
||||
}
|
||||
|
||||
func decodeSimpleFloat(src *bufio.Reader) []byte {
|
||||
pb := readByte(src)
|
||||
major := pb & maskOutAdditionalType
|
||||
minor := pb & maskOutMajorType
|
||||
if major != majorTypeSimpleAndFloat {
|
||||
panic(fmt.Errorf("Major type is: %d in decodeSimpleFloat", major))
|
||||
}
|
||||
switch minor {
|
||||
case additionalTypeBoolTrue:
|
||||
return []byte("true")
|
||||
case additionalTypeBoolFalse:
|
||||
return []byte("false")
|
||||
case additionalTypeNull:
|
||||
return []byte("null")
|
||||
case additionalTypeFloat16:
|
||||
fallthrough
|
||||
case additionalTypeFloat32:
|
||||
fallthrough
|
||||
case additionalTypeFloat64:
|
||||
src.UnreadByte()
|
||||
v, bc := decodeFloat(src)
|
||||
ba := []byte{}
|
||||
switch {
|
||||
case math.IsNaN(v):
|
||||
return []byte("\"NaN\"")
|
||||
case math.IsInf(v, 1):
|
||||
return []byte("\"+Inf\"")
|
||||
case math.IsInf(v, -1):
|
||||
return []byte("\"-Inf\"")
|
||||
}
|
||||
if bc == isFloat32 {
|
||||
ba = strconv.AppendFloat(ba, v, 'f', -1, 32)
|
||||
} else if bc == isFloat64 {
|
||||
ba = strconv.AppendFloat(ba, v, 'f', -1, 64)
|
||||
} else {
|
||||
panic(fmt.Errorf("Invalid Float precision from decodeFloat: %d", bc))
|
||||
}
|
||||
return ba
|
||||
default:
|
||||
panic(fmt.Errorf("Invalid Additional Type: %d in decodeSimpleFloat", minor))
|
||||
}
|
||||
}
|
||||
|
||||
func cbor2JsonOneObject(src *bufio.Reader, dst io.Writer) {
|
||||
pb, e := src.Peek(1)
|
||||
if e != nil {
|
||||
panic(e)
|
||||
}
|
||||
major := (pb[0] & maskOutAdditionalType)
|
||||
|
||||
switch major {
|
||||
case majorTypeUnsignedInt:
|
||||
fallthrough
|
||||
case majorTypeNegativeInt:
|
||||
n := decodeInteger(src)
|
||||
dst.Write([]byte(strconv.Itoa(int(n))))
|
||||
|
||||
case majorTypeByteString:
|
||||
s := decodeString(src, false)
|
||||
dst.Write(s)
|
||||
|
||||
case majorTypeUtf8String:
|
||||
s := decodeUTF8String(src)
|
||||
dst.Write(s)
|
||||
|
||||
case majorTypeArray:
|
||||
array2Json(src, dst)
|
||||
|
||||
case majorTypeMap:
|
||||
map2Json(src, dst)
|
||||
|
||||
case majorTypeTags:
|
||||
s := decodeTagData(src)
|
||||
dst.Write(s)
|
||||
|
||||
case majorTypeSimpleAndFloat:
|
||||
s := decodeSimpleFloat(src)
|
||||
dst.Write(s)
|
||||
}
|
||||
}
|
||||
|
||||
func moreBytesToRead(src *bufio.Reader) bool {
|
||||
_, e := src.ReadByte()
|
||||
if e == nil {
|
||||
src.UnreadByte()
|
||||
return true
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
// Cbor2JsonManyObjects decodes all the CBOR Objects read from src
|
||||
// reader. It keeps on decoding until reader returns EOF (error when reading).
|
||||
// Decoded string is written to the dst. At the end of every CBOR Object
|
||||
// newline is written to the output stream.
|
||||
//
|
||||
// Returns error (if any) that was encountered during decode.
|
||||
// The child functions will generate a panic when error is encountered and
|
||||
// this function will recover non-runtime Errors and return the reason as error.
|
||||
func Cbor2JsonManyObjects(src io.Reader, dst io.Writer) (err error) {
|
||||
defer func() {
|
||||
if r := recover(); r != nil {
|
||||
if _, ok := r.(runtime.Error); ok {
|
||||
panic(r)
|
||||
}
|
||||
err = r.(error)
|
||||
}
|
||||
}()
|
||||
bufRdr := bufio.NewReader(src)
|
||||
for moreBytesToRead(bufRdr) {
|
||||
cbor2JsonOneObject(bufRdr, dst)
|
||||
dst.Write([]byte("\n"))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Detect if the bytes to be printed is Binary or not.
|
||||
func binaryFmt(p []byte) bool {
|
||||
if len(p) > 0 && p[0] > 0x7F {
|
||||
return true
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
func getReader(str string) *bufio.Reader {
|
||||
return bufio.NewReader(strings.NewReader(str))
|
||||
}
|
||||
|
||||
// DecodeIfBinaryToString converts a binary formatted log msg to a
|
||||
// JSON formatted String Log message - suitable for printing to Console/Syslog.
|
||||
func DecodeIfBinaryToString(in []byte) string {
|
||||
if binaryFmt(in) {
|
||||
var b bytes.Buffer
|
||||
Cbor2JsonManyObjects(strings.NewReader(string(in)), &b)
|
||||
return b.String()
|
||||
}
|
||||
return string(in)
|
||||
}
|
||||
|
||||
// DecodeObjectToStr checks if the input is a binary format, if so,
|
||||
// it will decode a single Object and return the decoded string.
|
||||
func DecodeObjectToStr(in []byte) string {
|
||||
if binaryFmt(in) {
|
||||
var b bytes.Buffer
|
||||
cbor2JsonOneObject(getReader(string(in)), &b)
|
||||
return b.String()
|
||||
}
|
||||
return string(in)
|
||||
}
|
||||
|
||||
// DecodeIfBinaryToBytes checks if the input is a binary format, if so,
|
||||
// it will decode all Objects and return the decoded string as byte array.
|
||||
func DecodeIfBinaryToBytes(in []byte) []byte {
|
||||
if binaryFmt(in) {
|
||||
var b bytes.Buffer
|
||||
Cbor2JsonManyObjects(bytes.NewReader(in), &b)
|
||||
return b.Bytes()
|
||||
}
|
||||
return in
|
||||
}
|
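The decoders above all follow the same pattern: read the first byte of a CBOR item, mask off the top three bits to get the major type, and mask off the low five bits to get the additional-information value. A minimal, self-contained sketch of that header split (the constant values follow RFC 7049; the names mirror this file but are redeclared here for illustration):

```go
package main

import "fmt"

const (
    maskOutAdditionalType = 0xE0   // top 3 bits: major type
    maskOutMajorType      = 0x1F   // low 5 bits: additional info
    majorTypeUtf8String   = 3 << 5 // major type 3: UTF-8 text string
)

// decodeHeader splits a CBOR initial byte into its major-type and
// additional-information fields, exactly as the decoders above do.
func decodeHeader(b byte) (major, minor byte) {
    return b & maskOutAdditionalType, b & maskOutMajorType
}

func main() {
    // 0x63 encodes a definite-length UTF-8 string of length 3 ("foo").
    item := []byte{0x63, 'f', 'o', 'o'}
    major, minor := decodeHeader(item[0])
    fmt.Println(major == majorTypeUtf8String, minor) // true 3
    fmt.Println(string(item[1 : 1+int(minor)]))      // foo
}
```

For lengths up to 23 the additional-information field carries the length directly; larger counts follow in subsequent bytes, which is what `decodeIntAdditonalType` handles in the real code.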
5
vendor/github.com/rs/zerolog/not_go112.go
generated
vendored
Normal file
@@ -0,0 +1,5 @@
// +build !go1.12

package zerolog

const contextCallerSkipFrameCount = 3
1
vendor/gitlab.com/pztrn/flagger/.gitignore
generated
vendored
Normal file
@@ -0,0 +1 @@
*DS_Store*
34
vendor/gitlab.com/pztrn/flagger/.gitlab-ci.yml
generated
vendored
Normal file
@@ -0,0 +1,34 @@
stages:
  - test

before_script:
  - mkdir -p ${GOPATH}/src/gitlab.com/pztrn/
  - ln -s $CI_PROJECT_DIR ${GOPATH}/src/gitlab.com/pztrn/flagger
  - cd ${GOPATH}/src/gitlab.com/pztrn/flagger

test:1.11:
  image: golang:1.11.3
  stage: test
  tags:
    - docker
  script:
    - pwd
    - go test -test.v -cover ./...

test:1.10:
  image: golang:1.10.6
  stage: test
  tags:
    - docker
  script:
    - pwd
    - go test -test.v -cover ./...

test:1.9:
  image: golang:1.9.7
  stage: test
  tags:
    - docker
  script:
    - pwd
    - go test -test.v -cover ./...
3
vendor/gitlab.com/pztrn/flagger/.travis.yml
generated
vendored
Normal file
@@ -0,0 +1,3 @@
language: go
go:
  - "1.10"
36
vendor/gitlab.com/pztrn/flagger/Gopkg.lock
generated
vendored
Normal file
@@ -0,0 +1,36 @@
# This file is autogenerated, do not edit; changes may be undone by the next 'dep ensure'.


[[projects]]
  digest = "1:ffe9824d294da03b391f44e1ae8281281b4afc1bdaa9588c9097785e3af10cec"
  name = "github.com/davecgh/go-spew"
  packages = ["spew"]
  pruneopts = "UT"
  revision = "8991bc29aa16c548c550c7ff78260e27b9ab7c73"
  version = "v1.1.1"

[[projects]]
  digest = "1:0028cb19b2e4c3112225cd871870f2d9cf49b9b4276531f03438a88e94be86fe"
  name = "github.com/pmezard/go-difflib"
  packages = ["difflib"]
  pruneopts = "UT"
  revision = "792786c7400a136282c1664665ae0a8db921c6c2"
  version = "v1.0.0"

[[projects]]
  digest = "1:c40d65817cdd41fac9aa7af8bed56927bb2d6d47e4fea566a74880f5c2b1c41e"
  name = "github.com/stretchr/testify"
  packages = [
    "assert",
    "require",
  ]
  pruneopts = "UT"
  revision = "f35b8ab0b5a2cef36673838d662e249dd9c94686"
  version = "v1.2.2"

[solve-meta]
  analyzer-name = "dep"
  analyzer-version = 1
  input-imports = ["github.com/stretchr/testify/require"]
  solver-name = "gps-cdcl"
  solver-version = 1
34
vendor/gitlab.com/pztrn/flagger/Gopkg.toml
generated
vendored
Normal file
@@ -0,0 +1,34 @@
# Gopkg.toml example
#
# Refer to https://golang.github.io/dep/docs/Gopkg.toml.html
# for detailed Gopkg.toml documentation.
#
# required = ["github.com/user/thing/cmd/thing"]
# ignored = ["github.com/user/project/pkgX", "bitbucket.org/user/project/pkgA/pkgY"]
#
# [[constraint]]
#   name = "github.com/user/project"
#   version = "1.0.0"
#
# [[constraint]]
#   name = "github.com/user/project2"
#   branch = "dev"
#   source = "github.com/myfork/project2"
#
# [[override]]
#   name = "github.com/x/y"
#   version = "2.4.0"
#
# [prune]
#   non-go = false
#   go-tests = true
#   unused-packages = true


[[constraint]]
  name = "github.com/stretchr/testify"
  version = "1.2.2"

[prune]
  go-tests = true
  unused-packages = true
7
vendor/gitlab.com/pztrn/flagger/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,7 @@
Copyright 2017-2018, Stanislav N. aka pztrn

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
63
vendor/gitlab.com/pztrn/flagger/README.md
generated
vendored
Normal file
@@ -0,0 +1,63 @@
[](https://godoc.org/gitlab.com/pztrn/flagger)

# Flagger

Flagger is an arbitrary CLI flags parser, like argparse in Python.
Flagger is able to parse boolean, integer and string flags.

# Installation

```
go get -u -v lab.pztrn.name/golibs/flagger
```

# Usage

Flagger requires a logging interface to be passed on initialization.
See ``loggerinterface.go`` for the required logging functions.
It can work with the standard log package; in that case,
initialize flagger like:

```
flgr = flagger.New("My Super Program", flagger.LoggerInterface(log.New(os.Stdout, "testing logger: ", log.Lshortfile)))
flgr.Initialize()
```

Adding a flag is easy: just fill a ``Flag`` structure and pass it to the ``AddFlag()`` call:

```
flag_bool := Flag{
    Name: "boolflag",
    Description: "Boolean flag",
    Type: "bool",
    DefaultValue: true,
}
err := flgr.AddFlag(&flag_bool)
if err != nil {
    ...
}
```

After adding all necessary flags, issue a ``Parse()`` call to get
them parsed:

```
flgr.Parse()
```

After parsing, the values can be obtained anywhere you want, like:

```
val, err := flgr.GetBoolValue("boolflag")
if err != nil {
    ...
}
```

For more examples take a look at the ``flagger_test.go`` file or [at GoDoc](https://godoc.org/gitlab.com/pztrn/flagger).

# Get help

If you want to report a bug, feel free to report it via GitLab's issue system. Note that everything that isn't a bug report or feature request will be closed without any further comments.

If you want to request help (without warranties), propose a feature, or discuss flagger in any other way, please use our mailing list at flagger@googlegroups.com. To be able to send messages there you should subscribe by sending an email to ``flagger+subscribe@googlegroups.com``; the subject and mail body can be anything.
50
vendor/gitlab.com/pztrn/flagger/exported.go
generated
vendored
Normal file
@@ -0,0 +1,50 @@
// Flagger - arbitrary CLI flags parser.
//
// Copyright (c) 2017-2018, Stanislav N. aka pztrn.
//
// Permission is hereby granted, free of charge, to any person obtaining
// a copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to
// permit persons to whom the Software is furnished to do so, subject
// to the following conditions:
//
// The above copyright notice and this permission notice shall be
// included in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
// IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
// CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
// TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE
// OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

package flagger

import (
    // stdlib
    "log"
    "os"
)

var (
    logger          LoggerInterface
    applicationName string
)

// New creates a new Flagger instance.
// If no logger is passed, the default "log" package is used and
// logs are printed to stdout.
func New(appName string, l LoggerInterface) *Flagger {
    applicationName = appName
    if l == nil {
        lg := log.New(os.Stdout, "Flagger: ", log.LstdFlags)
        logger = LoggerInterface(lg)
    } else {
        logger = l
    }
    f := Flagger{}
    return &f
}
36
vendor/gitlab.com/pztrn/flagger/flag.go
generated
vendored
Normal file
@@ -0,0 +1,36 @@
// Flagger - arbitrary CLI flags parser.
//
// Copyright (c) 2017-2018, Stanislav N. aka pztrn.
//
// Permission is hereby granted, free of charge, to any person obtaining
// a copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to
// permit persons to whom the Software is furnished to do so, subject
// to the following conditions:
//
// The above copyright notice and this permission notice shall be
// included in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
// IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
// CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
// TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE
// OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

package flagger

// Flag represents an addable flag for Flagger.
type Flag struct {
    // Flag name. It will be accessible using this name later.
    Name string
    // Description for help output.
    Description string
    // Type can be one of "bool", "int", "string".
    Type string
    // This value will be reflected.
    DefaultValue interface{}
}
138
vendor/gitlab.com/pztrn/flagger/flagger.go
generated
vendored
Normal file
@@ -0,0 +1,138 @@
// Flagger - arbitrary CLI flags parser.
//
// Copyright (c) 2017-2018, Stanislav N. aka pztrn.
//
// Permission is hereby granted, free of charge, to any person obtaining
// a copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to
// permit persons to whom the Software is furnished to do so, subject
// to the following conditions:
//
// The above copyright notice and this permission notice shall be
// included in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
// IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
// CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
// TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE
// OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

package flagger

import (
    // stdlib
    "errors"
    "flag"
    "os"
    "sync"
)

// Flagger implements a (somewhat) extended CLI parameters parser. As it
// is available from CommonContext, these flags are available to the
// whole application.
//
// It uses reflection to determine what kind of variable we should
// parse or get.
type Flagger struct {
    // Flags that were added by the user.
    flags      map[string]*Flag
    flagsMutex sync.Mutex

    // Flags that will be passed to the flag module.
    flagsBool   map[string]*bool
    flagsInt    map[string]*int
    flagsString map[string]*string

    flagSet *flag.FlagSet
}

// AddFlag adds a flag to the list of flags we will pass to the ``flag`` package.
func (f *Flagger) AddFlag(flag *Flag) error {
    _, present := f.flags[flag.Name]
    if present {
        return errors.New("Cannot add flag '" + flag.Name + "' - already added!")
    }

    f.flags[flag.Name] = flag
    return nil
}

// GetBoolValue returns the boolean value for the flag with the given name.
// Returns the bool value and nil as error on success,
// or false plus an error describing the failure.
func (f *Flagger) GetBoolValue(name string) (bool, error) {
    fl, present := f.flagsBool[name]
    if !present {
        return false, errors.New("No such flag: " + name)
    }
    return (*fl), nil
}

// GetIntValue returns the integer value for the flag with the given name.
// Returns the integer on success, or 0 plus an error on failure.
func (f *Flagger) GetIntValue(name string) (int, error) {
    fl, present := f.flagsInt[name]
    if !present {
        return 0, errors.New("No such flag: " + name)
    }
    return (*fl), nil
}

// GetStringValue returns the string value for the flag with the given name.
// Returns the string on success, or an empty string plus an error on failure.
func (f *Flagger) GetStringValue(name string) (string, error) {
    fl, present := f.flagsString[name]
    if !present {
        return "", errors.New("No such flag: " + name)
    }
    return (*fl), nil
}

// Initialize initializes Flagger.
func (f *Flagger) Initialize() {
    logger.Print("Initializing CLI parameters parser...")

    f.flags = make(map[string]*Flag)

    f.flagsBool = make(map[string]*bool)
    f.flagsInt = make(map[string]*int)
    f.flagsString = make(map[string]*string)

    f.flagSet = flag.NewFlagSet(applicationName, flag.ContinueOnError)
}

// Parse adds flags from the flags map to the flag package and parses
// them. They can be obtained later by calling GetTYPEValue(name),
// where TYPE is one of Bool, Int, String.
func (f *Flagger) Parse() {
    // If flags were already parsed, do nothing.
    if f.flagSet.Parsed() {
        return
    }

    for name, fl := range f.flags {
        if fl.Type == "bool" {
            fdef := fl.DefaultValue.(bool)
            f.flagsBool[name] = &fdef
            f.flagSet.BoolVar(&fdef, name, fdef, fl.Description)
        } else if fl.Type == "int" {
            fdef := fl.DefaultValue.(int)
            f.flagsInt[name] = &fdef
            f.flagSet.IntVar(&fdef, name, fdef, fl.Description)
        } else if fl.Type == "string" {
            fdef := fl.DefaultValue.(string)
            f.flagsString[name] = &fdef
            f.flagSet.StringVar(&fdef, name, fdef, fl.Description)
        }
    }

    logger.Print("Parsing CLI parameters:", os.Args)
    err := f.flagSet.Parse(os.Args[1:])
    if err != nil {
        os.Exit(0)
    }
}
17
vendor/gitlab.com/pztrn/flagger/gitlab-ci.yml
generated
vendored
Normal file
@@ -0,0 +1,17 @@
image: golang:1.11.1

stages:
  - test

before_script:
  - mkdir -p ${GOPATH}/src/gitlab.com/pztrn/
  - ln -s $CI_PROJECT_DIR ${GOPATH}/src/gitlab.com/pztrn/flagger
  - cd ${GOPATH}/src/gitlab.com/pztrn/flagger

test:
  stage: test
  tags:
    - docker
  script:
    - pwd
    - go test -test.v -cover ./...
31
vendor/gitlab.com/pztrn/flagger/loggerinterface.go
generated
vendored
Normal file
@@ -0,0 +1,31 @@
// Flagger - arbitrary CLI flags parser.
//
// Copyright (c) 2017-2018, Stanislav N. aka pztrn.
//
// Permission is hereby granted, free of charge, to any person obtaining
// a copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to
// permit persons to whom the Software is furnished to do so, subject
// to the following conditions:
//
// The above copyright notice and this permission notice shall be
// included in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
// IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
// CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
// TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE
// OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

package flagger

// LoggerInterface provides a logging interface, so everyone can inject
// their own logging handlers.
type LoggerInterface interface {
    Fatal(v ...interface{})
    Print(v ...interface{})
}