Package for extracting software repository metadata

Project description

Scraper

Scraper is a tool for scraping and visualizing open source data from various code hosting platforms, such as GitHub.com, GitHub Enterprise, GitLab.com, hosted GitLab, and Bitbucket Server.

Getting Started: Code.gov

Code.gov is a website of the US Federal Government that allows the public to access metadata about the government's custom-developed software. The site requires this metadata to function, and this Python library can help produce it!

To get started, you will need a GitHub personal access token to make requests to the GitHub API. This should be set in your environment or shell rc file with the name GITHUB_API_TOKEN:

    $ export GITHUB_API_TOKEN=XYZ

    $ echo "export GITHUB_API_TOKEN=XYZ" >> ~/.bashrc

Additionally, to perform the labor hours estimation, you will need cloc installed in your environment. This is typically done with a package manager such as npm or Homebrew, as shown below.
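
For example, with either package manager:

    # Install cloc globally with npm
    $ npm install -g cloc
    # OR
    # Install cloc with Homebrew
    $ brew install cloc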

Then, to generate a code.json file for your agency, you will need a config.json file that specifies which platforms to connect to and scrape data from. An example config file can be found in demo.json. Once you have your config file, you are ready to install and run the scraper!

    # Install Scraper from a local copy of this repository
    $ pip install -e .
    # OR
    # Install Scraper from PyPI
    $ pip install llnl-scraper

    # Run Scraper with your config file ``config.json``
    $ scraper --config config.json

A full example of the resulting code.json file can be found here.
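
For reference, each entry in the releases array of a code.json file follows the code.gov metadata schema. A single release entry looks roughly like the following (the values here are illustrative placeholders, not taken from an actual inventory):

{
    "name": "example-project",
    "description": "An example repository inventoried by Scraper",
    "repositoryURL": "https://github.com/example-org/example-project",
    "permissions": {
        "usageType": "openSource",
        "licenses": [{ "name": "MIT" }]
    },
    "laborHours": 42,
    "tags": ["example"],
    "contact": { "email": "opensource@example.gov" }
}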

Config File Options

The configuration file is a JSON file that specifies which repository platforms to pull projects from, as well as some settings that can be used to override incomplete or inaccurate data returned by the scraping.

The basic structure is:

{
    // REQUIRED
    "contact_email": "...",  // Used when the contact email cannot be found otherwise

    // OPTIONAL
    "agency": "...",         // Your agency abbreviation here
    "organization": "...",   // The organization within the agency
    "permissions": { ... },  // Object containing default values for usageType and exemptionText

    // Platform configurations, described in more detail below
    "GitHub": [ ... ],
    "GitLab": [ ... ],
    "Bitbucket": [ ... ],
    "TFS": [ ... ]
}
"GitHub": [
    {
        "url": "https://github.com",  // GitHub.com or GitHub Enterprise URL to inventory
        "token": null,                // Private token for accessing this GitHub instance
        "public_only": true,          // Only inventory public repositories

        "orgs": [ ... ],    // List of organizations to inventory
        "repos": [ ... ],   // List of single repositories to inventory
        "exclude": [ ... ]  // List of organizations / repositories to exclude from inventory
    }
],
"GitLab": [
    {
        "url": "https://gitlab.com",  // GitLab.com or hosted GitLab instance URL to inventory
        "token": null,                // Private token for accessing this GitHub instance
        "fetch_languages": false,     // Include individual calls to API for language metadata. Very slow, so defaults to false. (eg, for 191 projects on internal server, 5 seconds for False, 12 minutes, 38 seconds for True)

        "orgs": [ ... ],    // List of organizations to inventory
        "repos": [ ... ],   // List of single repositories to inventory
        "exclude": [ ... ]  // List of groups / repositories to exclude from inventory
    }
],
"Bitbucket": [
    {
        "url": "https://bitbucket.internal",  // Base URL for a Bitbucket Server instance
        "username": "",                       // Username to authenticate with
        "password": "",                       // Password to authenticate with
        "token": "",                          // Token to authenticate with, if supplied username and password are ignored

        "exclude": [ ... ]  // List of projects / repositories to exclude from inventory
    }
],
"TFS": [
    {
        "url": "https://tfs.internal",  // Base URL for a Team Foundation Server (TFS) or Visual Studio Team Services (VSTS) or Azure DevOps instance
        "token": null,                  // Private token for accessing this TFS instance

        "exclude": [ ... ]  // List of projects / repositories to exclude from inventory
    }
]
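
Putting these pieces together, a minimal complete config.json might look like the following (the email, agency, organization, and org names are placeholders; adjust them for your environment):

{
    "contact_email": "opensource@example.gov",
    "agency": "DOE",
    "organization": "Example National Laboratory",
    "permissions": { "usageType": "openSource", "exemptionText": null },

    "GitHub": [
        {
            "url": "https://github.com",
            "token": null,
            "public_only": true,
            "orgs": ["example-org"],
            "repos": [],
            "exclude": []
        }
    ]
}

Running scraper --config config.json against a file like this produces the code.json inventory described above.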

License

Scraper is released under the MIT license. For more details, see the LICENSE file.

LLNL-CODE-705597


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llnl-scraper-0.10.0.tar.gz (25.2 kB, Source)

Built Distribution

llnl_scraper-0.10.0-py3-none-any.whl (30.9 kB, Python 3)

File details

Details for the file llnl-scraper-0.10.0.tar.gz.

File metadata

  • Download URL: llnl-scraper-0.10.0.tar.gz
  • Size: 25.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/51.0.0 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.9.0

File hashes

Hashes for llnl-scraper-0.10.0.tar.gz:

  • SHA256: 0b46d3d276988b5cadafeef17e1bdfa0dd6a0c6025360ace51bd6431e4fc26d3
  • MD5: 898a4c6dd2c3b487a413b785570f7a7d
  • BLAKE2b-256: 8f1f01aafb87114e92039b44edec2d2a5464c7e6c0d864a1c58d311cdc6657d6


File details

Details for the file llnl_scraper-0.10.0-py3-none-any.whl.

File metadata

  • Download URL: llnl_scraper-0.10.0-py3-none-any.whl
  • Size: 30.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/51.0.0 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.9.0

File hashes

Hashes for llnl_scraper-0.10.0-py3-none-any.whl:

  • SHA256: 624506562a4f890c2e365de7ffe099ed2d38d0b2626579b005f93d08fd688cbd
  • MD5: 9119dbfd9adb582ba3d9ea5a703a617c
  • BLAKE2b-256: 6fbf3bb888a356c246b6836e6439c0f507b313efbb7f1f61c2fad13577c21d43

