twitter-scraper alternatives and similar packages
Based on the "Third-party APIs" category.
Alternatively, view twitter-scraper alternatives based on common mentions on social networks and blogs.
- aws-sdk-go: AWS SDK for the Go programming language (in maintenance mode, end-of-life on 07/31/2025). The AWS SDK for Go v2 is available here: https://github.com/aws/aws-sdk-go-v2
- githubql: Package githubv4 is a client library for accessing GitHub GraphQL API v4 (https://docs.github.com/en/graphql).
- openaigo: OpenAI GPT-3/3.5 and GPT-4 ChatGPT API client library for Go; simple, with few dependencies, and well-tested.
- gostorm: GoStorm is a Go library that implements the communications protocol required to write Storm spouts and bolts in Go that communicate with the Storm shells.
- ynab: Go client for the YNAB API. Unofficial. It covers 100% of the resources made available by the YNAB API.
README
Twitter Scraper
Twitter's API is annoying to work with and has lots of limitations. Luckily, their JavaScript frontend has its own API, which I reverse-engineered. No API rate limits. No tokens needed. No restrictions. Extremely fast.
You can use this library to get the text of any user's Tweets trivially.
Installation
go get -u github.com/n0madic/twitter-scraper
Usage
Get user tweets
package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    for tweet := range scraper.GetTweets(context.Background(), "Twitter", 50) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.Text)
    }
}
It appears you can ask for up to 50 tweets per call (the timeline itself is limited to roughly the last 3,200 tweets).
Get single tweet
package main

import (
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    tweet, err := scraper.GetTweet("1328684389388185600")
    if err != nil {
        panic(err)
    }
    fmt.Println(tweet.Text)
}
Search tweets by query standard operators
Tweets containing "twitter", "scraper", and "data", filtering out retweets:
package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    for tweet := range scraper.SearchTweets(context.Background(),
        "twitter scraper data -filter:retweets", 50) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.Text)
    }
}
The search stops once 50 tweets have been collected.
See Rules and filtering for building standard queries.
Set search mode
scraper.SetSearchMode(twitterscraper.SearchLatest)
Options:
- twitterscraper.SearchTop - default mode
- twitterscraper.SearchLatest - live mode
- twitterscraper.SearchPhotos - image mode
- twitterscraper.SearchVideos - video mode
- twitterscraper.SearchUsers - user mode
Get profile
package main

import (
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    profile, err := scraper.GetProfile("Twitter")
    if err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", profile)
}
Search profiles by query
package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New().SetSearchMode(twitterscraper.SearchUsers)
    for profile := range scraper.SearchProfiles(context.Background(), "Twitter", 50) {
        if profile.Error != nil {
            panic(profile.Error)
        }
        fmt.Println(profile.Name)
    }
}
Get trends
package main

import (
    "fmt"

    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    trends, err := scraper.GetTrends()
    if err != nil {
        panic(err)
    }
    fmt.Println(trends)
}
Use cookie authentication
Some users' tweets are protected: to see them, you must log in and follow the account. The cookie and X-Csrf-Token are optional.
scraper.WithCookie("twitter cookie after login")
scraper.WithXCsrfToken("twitter X-Csrf-Token after login")
Use Proxy
HTTP(S) and SOCKS5 proxies are supported.
with HTTP
err := scraper.SetProxy("http://localhost:3128")
if err != nil {
    panic(err)
}
with SOCKS5
err := scraper.SetProxy("socks5://localhost:1080")
if err != nil {
    panic(err)
}
Delay requests
Add a delay between API requests (in seconds):
scraper.WithDelay(5)
Load timeline with tweet replies
scraper.WithReplies(true)
Default Scraper (Ad hoc)
In simple cases, you can use the default scraper without creating an object instance:
import twitterscraper "github.com/n0madic/twitter-scraper"
// for tweets
twitterscraper.GetTweets(context.Background(), "Twitter", 50)
// for tweets with replies
twitterscraper.WithReplies(true).GetTweets(context.Background(), "Twitter", 50)
// for search
twitterscraper.SearchTweets(context.Background(), "twitter", 50)
// for profile
twitterscraper.GetProfile("Twitter")
// for trends
twitterscraper.GetTrends()