A command-line utility and Python library to access the social share counts for a particular URL.
Usage:

```
socialshares <url> [<platforms>...] [options]

Options:
  -h, --help                         Show this screen.
  -p, --plain                        Plain output.
  -r <attempts>, --retry <attempts>  Retry fetching up to <attempts> times [default: 1].
  -e, --exit                         Exit with an error code when not all counts could be fetched.
```
Some examples:

```sh
# fetch counts for all supported platforms,
# trying again once (the default) for platforms that fail
$ socialshares http://www.kalzumeus.com/2010/06/17/falsehoods-programmers-believe-about-names/

# fetch only facebook and twitter, retrying up to twice
$ socialshares http://www.theguardian.com/politics facebook twitter --retry 2
```
| Platform | Description |
|---|---|
| twitter | tweets and retweets containing the URL |
| facebook | likes |
| facebookfql | facebook likes, shares and comments (in that order; deprecated but supported until mid-2016) |
| linkedin | shares |
| google | +1's |
| pinterest | pins |
| reddit | ups and downs (summed across posts) |
Platforms are fetched in parallel and retried (once by default). If no platforms are specified, only facebook and twitter counts are returned.
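The fetch-in-parallel-then-retry behaviour described above can be sketched with the standard library alone. This is an illustration of the pattern, not socialshares' actual internals; `fetch_one` is a hypothetical callable that returns a count or `None` on failure.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_with_retry(fetch_one, platforms, attempts=1):
    """Fetch each platform in parallel; re-fetch failed platforms
    up to `attempts` more times (mirroring the --retry option)."""
    counts = {}
    remaining = list(platforms)
    # one initial pass plus `attempts` retry passes
    for _ in range(attempts + 1):
        if not remaining:
            break
        with ThreadPoolExecutor(max_workers=len(remaining)) as pool:
            results = dict(zip(remaining, pool.map(fetch_one, remaining)))
        # keep successful counts, queue failures for the next pass
        counts.update({p: n for p, n in results.items() if n is not None})
        remaining = [p for p in remaining if results[p] is None]
    return counts
```

With `attempts=1`, a platform that fails its first fetch is tried exactly once more, which matches the documented default.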
By default, socialshares outputs JSON:

```json
{
    "reddit": {
        "downs": 0,
        "ups": 6
    },
    "google": 20,
    "facebook": 1498,
    "twitter": 300,
    "pinterest": 1
}
```
Use the `--plain` flag if instead you'd like space-separated output:

```sh
$ socialshares http://www.theguardian.com/politics twitter
57
```
The package can also be used as a Python library:

```python
import socialshares

url = 'http://www.theguardian.com/politics'
counts = socialshares.fetch(url, ['facebook', 'pinterest'])
```
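Assuming the result has the same shape as the JSON output shown earlier (integers per platform, with reddit as a nested dict of ups and downs), the counts can be post-processed like any dict. The sample data below stands in for a real `socialshares.fetch` call; `total_shares` is an illustrative helper, not part of the library.

```python
# sample counts shaped like the JSON output above
counts = {
    'reddit': {'downs': 0, 'ups': 6},
    'google': 20,
    'facebook': 1498,
    'twitter': 300,
    'pinterest': 1,
}

def total_shares(counts):
    """Sum counts across platforms, flattening nested
    per-platform breakdowns such as reddit's ups/downs."""
    total = 0
    for value in counts.values():
        total += sum(value.values()) if isinstance(value, dict) else value
    return total

print(total_shares(counts))  # → 1825
```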
```sh
pip install socialshares

# optionally, for asynchronous fetching
pip install grequests
```
If `requests_futures` and (for Python 2.x) `futures` are installed, socialshares will use these packages to speed up share count fetching by accessing the various social media APIs in parallel.
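This kind of optional-dependency check is typically done with an import-and-fall-back pattern. The sketch below probes for `requests_futures` and degrades to sequential fetching when it is missing; the names and structure are illustrative, not socialshares' actual implementation.

```python
# probe for the optional dependency; fall back to sequential fetching
try:
    from requests_futures.sessions import FuturesSession
    PARALLEL = True
except ImportError:
    FuturesSession = None
    PARALLEL = False

def fetch_all(urls, fetch):
    """Fetch every URL, in parallel when requests_futures is available,
    otherwise one at a time with the supplied `fetch` callable."""
    if PARALLEL:
        session = FuturesSession()
        # issue all requests first, then collect the results
        futures = [session.get(url) for url in urls]
        return [f.result() for f in futures]
    return [fetch(url) for url in urls]
```

Because the check happens once at import time, callers pay no per-request cost for the feature detection.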