scan alternatives and similar packages
Based on the "Utilities" category.
- hub: Wraps git commands with additional functionality to interact with GitHub from the terminal.
- excelize: Golang library for reading and writing Microsoft Excel (XLSX) files.
- xlsx: Library to simplify reading the XML format used by recent versions of Microsoft Excel in Go programs.
- godropbox: Common libraries for writing Go services/applications from Dropbox.
- godotenv: A Go port of Ruby's dotenv library (loads environment variables from .env).
- hystrix-go: Implements Hystrix patterns of programmer-defined fallbacks, aka circuit breaker.
- goreporter: A Golang tool that does static analysis, unit testing, code review, and generates a code quality report.
- go-funk: A modern Go utility library which provides helpers (map, find, contains, filter, chunk, reverse, ...).
- gojson: Automatically generates Go (golang) struct definitions from example JSON.
- mc: Minio Client provides minimal tools to work with Amazon S3-compatible cloud storage and filesystems.
- grequests: An elegant and simple net/http wrapper that follows Python's requests library.
- mergo: A helper to merge structs and maps in Golang. Useful for configuration default values, avoiding messy if-statements.
- filetype: Small package to infer the file type by checking the magic numbers signature.
- boilr: A blazingly fast CLI tool for creating projects from boilerplate templates.
- go-underscore: A useful collection of helpfully functional Go collection utilities.
- beaver: A real-time messaging server. With Beaver you can easily build scalable in-app notifications, real-time graphs, multiplayer games, chat applications, geotracking, and more in web applications and mobile apps.
- JobRunner: Smart and featureful cron job scheduler with job queuing and live monitoring built in.
- httpcontrol: Allows for HTTP transport-level control around timeouts and retries.
README
Scan
Scan provides the ability to use `database/sql` rows to scan datasets directly into structs or slices. For the most comprehensive and up-to-date docs, see the godoc.
Examples
Multiple Rows
```go
db, err := sql.Open("sqlite3", "database.sqlite")
rows, err := db.Query("SELECT * FROM persons")
var persons []Person
err = scan.Rows(&persons, rows)

fmt.Printf("%#v", persons)
// []Person{
//    {ID: 1, Name: "brett"},
//    {ID: 2, Name: "fred"},
//    {ID: 3, Name: "stacy"},
// }
```
Multiple rows of primitive type
```go
rows, err := db.Query("SELECT name FROM persons")
var names []string
err = scan.Rows(&names, rows)

fmt.Printf("%#v", names)
// []string{
//    "brett",
//    "fred",
//    "stacy",
// }
```
Single row
```go
rows, err := db.Query("SELECT * FROM persons WHERE name = 'brett' LIMIT 1")
var person Person
err = scan.Row(&person, rows)

fmt.Printf("%#v", person)
// Person{ ID: 1, Name: "brett" }
```
Scalar value
```go
rows, err := db.Query("SELECT age FROM persons WHERE name = 'brett' LIMIT 1")
var age int8
err = scan.Row(&age, rows)

fmt.Printf("%d", age)
// 100
```
Strict Scanning
Both `Rows` and `Row` have strict alternatives that scan to structs strictly based on their `db` tag. To avoid unwanted behavior, use `RowsStrict` or `RowStrict` to scan without using field names. Any fields not tagged with the `db` tag will be ignored, even if columns are found that match the field names.
Columns
`Columns` scans a struct and returns a string slice of the assumed column names, based on the `db` tag or the struct field name respectively. To avoid assumptions, use `ColumnsStrict`, which will only return the fields tagged with the `db` tag. Both `Columns` and `ColumnsStrict` are variadic: they accept a string slice of column names to exclude from the list. It is recommended that you cache this slice.
```go
package main

type User struct {
	ID        int64
	Name      string
	Age       int
	BirthDate string `db:"bday"`
	Zipcode   string `db:"-"`
	Store     struct {
		ID int
		// ...
	}
}

var nobody = new(User)

var userInsertCols = scan.Columns(nobody, "ID")
// []string{ "Name", "Age", "bday" }

var userSelectCols = scan.Columns(nobody)
// []string{ "ID", "Name", "Age", "bday" }
```
Values
`Values` scans a struct and returns the values associated with the provided columns. Values uses a `sync.Map` to cache the fields of structs, greatly improving the performance of scanning types. The first time a struct is scanned, its exported field locations are cached; retrieving values from the same struct later should be much faster. See Benchmarks below.
```go
user := &User{
	ID:   1,
	Name: "Brett",
	Age:  100,
}

vals := scan.Values([]string{"ID", "Name"}, user)
// []interface{}{ 1, "Brett" }
```
I find that the usefulness of both Values and Columns lies in using a library such as sq.

```go
sq.Insert(userCols...).
	Into("users").
	Values(scan.Values(userCols, &user)...)
```
Configuration
- AutoClose: Automatically call `rows.Close()` after the scan completes (default true)
Why
While many other projects support similar features (e.g. sqlx), scan allows you to use any database library, such as the stdlib or squirrel, to write fluent SQL statements and pass the resulting rows to scan for scanning.
Benchmarks

```
λ go test -bench=. -benchtime=10s ./...
goos: linux
goarch: amd64
pkg: github.com/blockloop/scan
BenchmarkColumnsLargeStruct-8        50000000       272 ns/op
BenchmarkValuesLargeStruct-8          2000000      8611 ns/op
BenchmarkScanRowOneField-8            2000000      8528 ns/op
BenchmarkScanRowFiveFields-8          1000000     12234 ns/op
BenchmarkScanTenRowsOneField-8        1000000     16802 ns/op
BenchmarkScanTenRowsTenFields-8        100000    104587 ns/op
PASS
ok      github.com/blockloop/scan    116.055s
```