twitter-scraper alternatives and similar packages
Based on the "Third-party APIs" category.
Alternatively, view twitter-scraper alternatives based on common mentions on social networks and blogs.
- telegram-bot-api - Golang bindings for the Telegram Bot API
- goamz - Popular fork of goamz which adds some missing API calls to certain packages.
- webhooks - :fishing_pole_and_fish: Webhook receiver for GitHub, Bitbucket, GitLab, Gogs
- githubql - Package githubv4 is a client library for accessing GitHub GraphQL API v4 (https://docs.github.com/en/graphql).
- geo-golang - Go library to access geocoding and reverse geocoding APIs
- lark - Feishu(飞书)/Lark Open API Go SDK, supporting all Open APIs and event callbacks.
- VK SDK for Golang - Golang module for working with the VK API
- gostorm - GoStorm is a Go library that implements the communications protocol required to write Storm spouts and bolts in Go that communicate with the Storm shells.
- hipchat (xmpp) - A golang package to communicate with HipChat over XMPP
- clarifai - DEPRECATED: please use https://github.com/Clarifai/clarifai-go-grpc
- hipchat - This project implements a Go client library for the HipChat API.
- go-trending - Go library for accessing trending repositories and developers at GitHub.
- go-lark - An easy-to-use SDK for Feishu and Lark Open Platform (messaging API only)
- go-tgbot - Golang Telegram bot API wrapper, session-based router and middleware
- cachet - Go(lang) client library for Cachet (open source status page system).
- simples3 - Simple no-frills AWS S3 Golang library using REST with V4 signing (without the AWS Go SDK)
- go-postman-collection - Go module to work with Postman Collections
- ynab - Go client for the YNAB API. Unofficial. It covers 100% of the resources made available by the YNAB API.
- GoMusicBrainz - a Go (Golang) MusicBrainz WS2 client library - work in progress
README
Twitter Scraper
Twitter's API is annoying to work with and has lots of limitations — luckily their frontend (JavaScript) has its own API, which I reverse-engineered. No API rate limits. No tokens needed. No restrictions. Extremely fast.
You can use this library to trivially get the text of any user's Tweets.
Installation
go get -u github.com/n0madic/twitter-scraper
Usage
Get user tweets
package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    for tweet := range scraper.GetTweets(context.Background(), "Twitter", 50) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.Text)
    }
}
You can ask for up to 50 tweets per request; Twitter caps the retrievable timeline at roughly 3,200 tweets per user.
Get single tweet
package main

import (
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    tweet, err := scraper.GetTweet("1328684389388185600")
    if err != nil {
        panic(err)
    }
    fmt.Println(tweet.Text)
}
Search tweets by query (standard operators)
Tweets containing “twitter” and “scraper” and “data”, filtering out retweets:
package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    for tweet := range scraper.SearchTweets(context.Background(),
        "twitter scraper data -filter:retweets", 50) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.Text)
    }
}
The search ends once 50 tweets have been collected.
See Rules and filtering for building standard queries.
Set search mode
scraper.SetSearchMode(twitterscraper.SearchLatest)
Options:
- twitterscraper.SearchTop - default mode
- twitterscraper.SearchLatest - live mode
- twitterscraper.SearchPhotos - image mode
- twitterscraper.SearchVideos - video mode
- twitterscraper.SearchUsers - user mode
Get profile
package main

import (
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    profile, err := scraper.GetProfile("Twitter")
    if err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", profile)
}
Search profiles by query
package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New().SetSearchMode(twitterscraper.SearchUsers)
    for profile := range scraper.SearchProfiles(context.Background(), "Twitter", 50) {
        if profile.Error != nil {
            panic(profile.Error)
        }
        fmt.Println(profile.Name)
    }
}
Get trends
package main

import (
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    trends, err := scraper.GetTrends()
    if err != nil {
        panic(err)
    }
    fmt.Println(trends)
}
Use cookie authentication
Some users' tweets are protected, so you must be logged in and following them to read those tweets. The cookie and X-Csrf-Token are optional.
scraper.WithCookie("twitter cookie after login")
scraper.WithXCsrfToken("twitter X-Csrf-Token after login")
Use Proxy
Supports HTTP(S) and SOCKS5 proxies.
with HTTP
err := scraper.SetProxy("http://localhost:3128")
if err != nil {
panic(err)
}
with SOCKS5
err := scraper.SetProxy("socks5://localhost:1080")
if err != nil {
panic(err)
}
Delay requests
Add a delay between API requests (in seconds):
scraper.WithDelay(5)
Load timeline with tweet replies
scraper.WithReplies(true)
Default Scraper (Ad hoc)
In simple cases, you can use the default scraper without creating an object instance:
import twitterscraper "github.com/n0madic/twitter-scraper"
// for tweets
twitterscraper.GetTweets(context.Background(), "Twitter", 50)
// for tweets with replies
twitterscraper.WithReplies(true).GetTweets(context.Background(), "Twitter", 50)
// for search
twitterscraper.SearchTweets(context.Background(), "twitter", 50)
// for profile
twitterscraper.GetProfile("Twitter")
// for trends
twitterscraper.GetTrends()