go-bqstreamer alternatives and similar packages
Based on the "Relational Databases" category. Alternatively, view go-bqstreamer alternatives based on common mentions on social networks and blogs.
- go-sql-driver/mysql: Go MySQL Driver is a MySQL driver for Go's (golang) database/sql package
- sqlx: general purpose extensions to golang's database/sql
- pq: Pure Go Postgres driver for database/sql
- pgx: PostgreSQL driver and toolkit for Go
- go-sqlite3: sqlite3 driver for go using database/sql
- go-mssqldb: Microsoft SQL Server driver written in Go
- go-oci8: Oracle driver for Go using database/sql
- godror: GO DRiver for ORacle DB
- goracle: Oracle driver for Go, using the ODPI-C driver
- firebirdsql: Firebird RDBMS sql driver for Go (golang)
- gofreetds: Go SQL Server database driver
- go-adodb: Microsoft ActiveX Object DataBase driver for Go using exp/sql
- Sqinn-Go: SQLite with pure Go
- vertica-sql-go: Official native Go client for the Vertica Analytics Database
- avatica: Apache Phoenix/Avatica SQL driver for database/sql
- bgc: Datastore Connectivity for BigQuery in Go
- pig: Simple pgx wrapper to execute and scan query results
Kik and me (@oryband) are no longer maintaining this repository. Thanks for all the contributions. You are welcome to fork and continue development.
Stream insert data into BigQuery fast and concurrently.
- Insert rows from multiple tables, datasets, and projects, and insert them in bulk. No need to manage data structures or sort rows by table - bqstreamer does it for you.
- Multiple background workers (i.e. goroutines) to enqueue and insert rows.
- Inserts can be performed in a blocking manner, or in the background (asynchronously).
- Perform insert operations in predefined batch sizes, according to BigQuery's quota policy.
- Handle and retry BigQuery server errors.
- Backoff interval between failed insert operations.
- Error reporting.
- Production ready, and thoroughly tested. We - at Rounds (now acquired by Kik) - are using it in our data gathering workflow.
- Thorough testing and documentation for great good!
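The multi-table bulk behavior described above boils down to bucketing rows by their destination, so each bucket maps onto a single bulk insert request. Here is a minimal, standard-library-only sketch of that grouping step; the `Row` type and `groupByTable` function are illustrative stand-ins, not the library's actual API.

```go
package main

import "fmt"

// Row represents a single row destined for a specific BigQuery table.
// (Illustrative type; not the library's real row type.)
type Row struct {
	Project, Dataset, Table string
	Data                    map[string]interface{}
}

// groupByTable buckets rows by their destination (project, dataset, table)
// so that each bucket can be sent as one bulk insert request.
func groupByTable(rows []Row) map[string][]Row {
	groups := make(map[string][]Row)
	for _, r := range rows {
		key := r.Project + "." + r.Dataset + "." + r.Table
		groups[key] = append(groups[key], r)
	}
	return groups
}

func main() {
	rows := []Row{
		{"p1", "d1", "events", map[string]interface{}{"id": 1}},
		{"p1", "d1", "events", map[string]interface{}{"id": 2}},
		{"p1", "d1", "users", map[string]interface{}{"id": 3}},
	}
	groups := groupByTable(rows)
	fmt.Println(len(groups))                  // 2 distinct destination tables
	fmt.Println(len(groups["p1.d1.events"])) // 2 rows bucketed for one table
}
```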
- Install Go, version 1.5 or newer.
- Clone this repository and download dependencies:
- Version v2: `go get gopkg.in/kikinteractive/go-bqstreamer.v2`
- Version v1: `go get gopkg.in/kikinteractive/go-bqstreamer.v1`
- Acquire Google OAuth2/JWT credentials, so you can authenticate with BigQuery.
How Does It Work?
There are two types of inserters you can use:
SyncWorker, which is a single blocking (synchronous) worker.
- It enqueues rows and performs insert operations in a blocking manner.
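A blocking worker of this kind is essentially an enqueue buffer plus a flush loop that respects a per-request row cap. The sketch below uses only the standard library; the type names, and the 500-row cap, are assumptions for illustration, not the library's real API or BigQuery's exact quota.

```go
package main

import "fmt"

// maxRowsPerInsert caps one insert request. 500 is an illustrative value,
// chosen in the spirit of BigQuery's per-request quota limits.
const maxRowsPerInsert = 500

// SyncInserter mimics a blocking worker: Enqueue buffers rows, and Flush
// performs insert operations synchronously, in quota-sized chunks.
// (Illustrative names; not the library's actual types.)
type SyncInserter struct {
	queue  []string
	insert func(batch []string) // stand-in for the real BigQuery insert call
}

func (s *SyncInserter) Enqueue(row string) { s.queue = append(s.queue, row) }

// Flush blocks until every queued row has been handed to insert.
func (s *SyncInserter) Flush() {
	for len(s.queue) > 0 {
		n := len(s.queue)
		if n > maxRowsPerInsert {
			n = maxRowsPerInsert
		}
		s.insert(s.queue[:n])
		s.queue = s.queue[n:]
	}
}

func main() {
	var sizes []int
	w := &SyncInserter{insert: func(batch []string) { sizes = append(sizes, len(batch)) }}
	for i := 0; i < 1200; i++ {
		w.Enqueue(fmt.Sprintf("row-%d", i))
	}
	w.Flush()
	fmt.Println(sizes) // [500 500 200]
}
```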
AsyncWorkerGroup, which employs multiple background workers (goroutines).
- AsyncWorkerGroup enqueues rows, and its background workers pull and insert them in a fan-out model.
- An insert operation is executed according to row amount or time thresholds for each background worker.
- Errors are reported to an error channel for processing by the user.
- This provides a higher insert throughput for larger scale scenarios.
Check the GoDoc examples section.
- Please check the issues page.
- File new bugs and ask for improvements.
- Pull requests welcome!
```bash
# Run unit tests and check coverage.
$ make test

# Run integration tests.
# This requires an active project, dataset and pem key.
$ export BQSTREAMER_PROJECT=my-project
$ export BQSTREAMER_DATASET=my-dataset
$ export BQSTREAMER_TABLE=my-table
$ export BQSTREAMER_KEY=my-key.json
$ make testintegration
```