chris
Pratt parser implementation in Go for parsing mathematical equations.
The core implementation follows the advice from Bob Nystrom's article on Pratt parsing. My notes on Pratt parsing and this project can be found here.
chris aims to let users input mathematical equations that can be parsed and compiled into valid Go functions, which can then be used with plotting libraries in Go such as gonum/plot. However, there are many other ways to use such a library.
Sample
chris supports most mathematical equations that Desmos supports. Additional operators will be added down the line. To view the current operators, refer to the Operators/Symbols table below.
1 + 2 * 3 := 1 + (2 * 3)
sin(pi/4) := sin((pi/4))
2^x + cos(pi/4 + 15) := (2^x) + cos(((pi/4) + 15))
Usage
To use chris in your own project, download it as a package using Go modules:
go get github.com/woojiahao/chris
To set up a basic compiler, we use both the lexer and parser modules. The lexer generates the token stream, and the parser parses that token stream into a given Abstract Syntax Tree (AST). For more information about the roles of each component, refer to the Architecture section below.
The lexer receives a keyword list and a constant list that determine how tokens are tokenized. The parser only requires the lexer to generate the AST. To retrieve the AST, simply call parser#Parse.
```go
package compiler

import (
	"fmt"

	"github.com/woojiahao/chris/pkg/lexer"
	"github.com/woojiahao/chris/pkg/parser"
)

type Compiler struct {
	l *lexer.Lexer
	p *parser.Parser
}

func New(exp string) *Compiler {
	keywords := []string{"sin", "cos", "tan", "csc", "sec", "cot"}
	constants := []string{"pi"}

	l := lexer.New(exp, keywords, constants)
	p := parser.New(l)

	// Parse the expression and get the AST. We ignore the err for now.
	ast, _ := p.Parse()
	fmt.Printf("AST: %v\n", ast)

	return &Compiler{l, p}
}
```
Refer to example/ for a sample compiler which parses the equation and generates a function of type func(float64) float64 that can be used in plotting libraries like gonum/plot.
Architecture
The general architecture of a programming language compiler is as follows:
flowchart LR
Lexer-->Parser-->Compiler
Lexer - acts as an iterator over a given expression, converting each character/word into a token. It ignores whitespace and parses numbers and words as whole chunks.
Parser - reads the token stream from a given Lexer and applies the grammar to the tokens to generate an AST. It is not responsible for checking whether the keywords are valid; it only needs to know that the expression can generate a valid AST.
Compiler - receives the generated AST from the Parser and performs operations on the AST and its nodes.
chris, however, is not a compiler but a parser, so it will not compile the generated AST.
Parselets
Parser logic is performed by "parselets": the components that handle the behavior of each token. This is slightly different from having one function per non-terminal in the grammar.
There are two kinds of parselets, prefix and infix. Prefix parselets handle tokens that can start an independent sub-expression, like numbers, ( or variables, while infix parselets require a left and right sub-expression to generate a node.
Operators/Symbols
Symbol | Purpose | Position | Precedence
---|---|---|---
+ | Addition | Infix | 2
- | Subtraction | Prefix/Infix | 2
* | Multiplication | Infix | 3
/ | Division | Infix | 3
^ | Exponent | Infix | 4
( | Create sub-expression or encapsulate a function's arguments | Prefix/Infix | 5
) | End sub-expression | - | -1
= | Assignment | Infix | 1
<keyword> | Keyword that corresponds to a function | Infix | 1
<number> | Number | Prefix | 1
<variable> | Single character representing a variable | Prefix | 1
<constant> | User-specified constant | Prefix | 1
BNF
```
# chris BNF

# General terminals
<digit>  ::= '0' | ... | '9'
<letter> ::= 'a' | ... | 'z'
           | 'A' | ... | 'Z'

# Terminals in chris
<number>   ::= <digit>
             | <digit>'.'<digit>
             | <number><digit>
<variable> ::= <letter>
<keyword>  ::= <letter>+

# Non-terminals
<operator> ::= '+' | '-' | '*' | '/' | '^'
<unary>    ::= '-'
<expression> ::= <number>
               | <variable>
               | <keyword>
               | <unary> <expression>
               | <expression> <operator> <expression>
               | <expression> <expression>
               | <function call>
               | <group>
<function call> ::= <keyword> '(' <expression>* ')'
<group>      ::= '(' <expression> ')'
<assignment> ::= <variable> '=' <expression>
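```
As a worked example, here is one possible derivation of sin(pi/4) under this grammar. Note that "pi" matches <keyword>, since the grammar folds constants into <letter>+ words:

```
<expression>
→ <function call>
→ <keyword> '(' <expression>* ')'
→ 'sin' '(' <expression> <operator> <expression> ')'
→ 'sin' '(' <keyword> '/' <number> ')'
→ 'sin' '(' 'pi' '/' '4' ')'
```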