Scrape Facebook public pages without an API key

Project description

Facebook Scraper

Scrape Facebook public pages without an API key. Inspired by twitter-scraper.

Install

pip install facebook-scraper

Usage

Pass the page's unique name as the first parameter and you're good to go:

>>> from facebook_scraper import get_posts

>>> for post in get_posts('nintendo', pages=1):
...     print(post['text'][:50])
...
The final step on the road to the Super Smash Bros
We're headed to PAX East 3/28-3/31 with new games

Optional parameters

  • group: the ID of a group, to scrape a group instead of a page. Default is None.
  • pages: how many pages of posts to request; the first page usually has 2 posts and subsequent pages 4. Default is 10.
  • timeout: how many seconds to wait before timing out. Default is 5.
  • sleep: how many seconds to sleep between requests. Default is 0.
  • credentials: a (user, password) tuple for logging in before requesting posts. Default is None.
  • extra_info: bool; if True, the function makes an extra request to fetch the post's reactions. Default is False.
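A call combining several of these parameters might look like the sketch below. The group ID and credentials are placeholders, and facebook-scraper must actually be installed for the call to run; the import is deliberately local so the sketch loads either way.

```python
def fetch_group_posts(group_id, user, password):
    """Scrape a group's posts, logging in first and pacing requests."""
    # Imported inside the function so this sketch loads even where
    # facebook-scraper isn't installed.
    from facebook_scraper import get_posts

    posts = get_posts(
        group=group_id,                # scrape a group instead of a page
        pages=3,                       # request three pages of posts
        timeout=10,                    # give slow responses up to 10 s
        sleep=2,                       # wait 2 s between requests
        credentials=(user, password),  # log in before scraping
        extra_info=True,               # also fetch the reactions dict
    )
    return list(posts)
```

Sleeping between requests and keeping `pages` small is a reasonable default posture for any scraper, not just this one.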

Post example

{'post_id': '2257188721032235',
 'text': 'Don’t let this diminutive version of the Hero of Time fool you, '
         'Young Link is just as heroic as his fully grown version! Young Link '
         'joins the Super Smash Bros. series of amiibo figures!',
 'time': datetime.datetime(2019, 4, 29, 12, 0, 1),
 'image': 'https://scontent.flim16-1.fna.fbcdn.net'
          '/v/t1.0-0/cp0/e15/q65/p320x320'
          '/58680860_2257182054366235_1985558733786185728_n.jpg'
          '?_nc_cat=1&_nc_ht=scontent.flim16-1.fna'
          '&oh=31b0ba32ec7886e95a5478c479ba1d38&oe=5D6CDEE4',
 'likes': 2036,
 'comments': 214,
 'shares': 0,
 'reactions': {'like': 135, 'love': 64, 'haha': 10, 'wow': 4, 'anger': 1},  # if `extra_info` was set
 'post_url': 'https://m.facebook.com/story.php'
             '?story_fbid=2257188721032235&id=119240841493711',
 'link': 'https://bit.ly/something'}
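Each post is a plain dict, so post-processing is ordinary Python. One wrinkle worth a sketch: the `time` field is a `datetime`, which `json.dumps` cannot serialize natively. Using an abridged copy of the post above:

```python
import datetime as dt
import json

post = {
    'post_id': '2257188721032235',
    'text': 'Young Link joins the Super Smash Bros. series of amiibo figures!',
    'time': dt.datetime(2019, 4, 29, 12, 0, 1),
    'likes': 2036,
    'comments': 214,
    'shares': 0,
}

def encode(obj):
    # json.dumps calls this for any object it can't serialize itself;
    # convert datetimes to ISO-8601 strings and reject anything else.
    if isinstance(obj, dt.datetime):
        return obj.isoformat()
    raise TypeError(f"unserializable type: {type(obj)!r}")

serialized = json.dumps(post, default=encode)
restored = json.loads(serialized)
print(restored['time'])  # '2019-04-29T12:00:01'
```

Note that round-tripping through JSON leaves `time` as a string; parse it back with `datetime.fromisoformat` if you need a `datetime` again.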

Notes

  • There is no guarantee that every field will be extracted (they might be None).
  • The shares count doesn't seem to work at the moment.
  • Group scraping only returns posts from the first page.
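Because any field may come back as None (or be absent entirely), guard before slicing or arithmetic. A small helper along these lines, with made-up sample posts:

```python
def preview(post, width=50):
    """Return a short text preview, tolerating a missing or None 'text' field."""
    text = post.get('text') or ''
    return text[:width]

posts = [
    {'text': 'The final step on the road to Super Smash Bros.'},
    {'text': None},  # extraction sometimes fails
    {},              # the field can also be absent entirely
]

print([preview(p) for p in posts])
```

The `post.get('text') or ''` idiom covers both failure modes (missing key and explicit None) in one expression.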

Alternatives and related projects

Project details


Release history

Download files

Download the file for your platform.

Source Distribution

facebook-scraper-0.2.1a0.tar.gz (12.2 kB, Source)

Built Distribution

facebook_scraper-0.2.1a0-py3-none-any.whl (12.6 kB, Python 3)

File details

Details for the file facebook-scraper-0.2.1a0.tar.gz.

File metadata

  • Download URL: facebook-scraper-0.2.1a0.tar.gz
  • Upload date:
  • Size: 12.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.0.5 CPython/3.8.2 Linux/5.4.0-7626-generic

File hashes

Hashes for facebook-scraper-0.2.1a0.tar.gz

  • SHA256: 9a7a502c50fa4637964bc759aac435594cb9850065ee2028a66e3c164bb65c26
  • MD5: 13295e8101efed0e6316ac31fd27752e
  • BLAKE2b-256: bf3cd9417b579a1a9d65bd2da20415f90232ca8d822ef98c9f8f3a4ec40d0fd4
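To check a downloaded file against these digests, compute the hash locally and compare. A generic sketch using only the standard library, demonstrated on a throwaway file (for the real check, point it at facebook-scraper-0.2.1a0.tar.gz and compare with the SHA256 above):

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk_size=8192):
    """Stream a file through SHA-256 so large files needn't fit in memory."""
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

# Demonstration on a small temporary file.
demo = Path('demo.bin')
demo.write_bytes(b'hello')
print(sha256_of(demo))
demo.unlink()
```

Compare the result against the published digest with a simple equality check; a mismatch means the download is corrupt or has been tampered with.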


File details

Details for the file facebook_scraper-0.2.1a0-py3-none-any.whl.

File metadata

  • Download URL: facebook_scraper-0.2.1a0-py3-none-any.whl
  • Upload date:
  • Size: 12.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.0.5 CPython/3.8.2 Linux/5.4.0-7626-generic

File hashes

Hashes for facebook_scraper-0.2.1a0-py3-none-any.whl

  • SHA256: eb5c94e138f02fcce656d6e72a688061390e4d27e402bc7658d0b83b6c392a7f
  • MD5: 1c85f85a7c54ede37201515b3d02f06b
  • BLAKE2b-256: 6f7a8d9e5adf07ad48b7b8a00e0459fc98e513e62aea933a5ca6d08f57d5836d

