Bugcrowd Scraper


Bugcrowd Scraper is a Python tool that collects engagement data and scope targets from Bugcrowd and exports structured items using Scrapy feeds.


Features

  • Collects engagements from the Bugcrowd public listings endpoint
  • Follows each engagement brief to fetch scope details
  • Automatic pagination across engagement pages
  • Rotating proxy support via proxies.txt
  • Deploy and run with Scrapyd + ScrapydWeb

Requirements

  • Python 3.11+
  • Scrapyd and Scrapyd Client
  • ScrapydWeb and Logparser (optional, recommended)

Installation

pip install -r requirements.txt

Configuration

proxies.txt

Add proxies in the following format:

username:password@ip:port

Deploy with Scrapyd

Install deploy dependencies:

pip install scrapyd scrapyd-client

Start Scrapyd:

scrapyd

Deploy the project:

scrapyd-client deploy

ScrapydWeb (optional)

Install the web panel:

pip install scrapydweb logparser

Start the services:

scrapyd &
scrapydweb &
logparser &

Scrapyd API

  • POST /schedule.json → Run spider
  • GET /listspiders.json → List spiders
  • GET /listprojects.json → Projects
  • GET /listjobs.json → Active jobs
  • POST /cancel.json → Stop job
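These endpoints can also be called from Python with only the standard library. A hedged sketch, assuming Scrapyd is running on its default port 6800; the project and spider names match the curl example in this README:

```python
import json
from urllib import parse, request

SCRAPYD = "http://localhost:6800"  # assumed default Scrapyd address

def schedule_body(project: str, spider: str) -> bytes:
    """Build the form-encoded body for POST /schedule.json."""
    return parse.urlencode({"project": project, "spider": spider}).encode()

def schedule(project: str, spider: str) -> dict:
    """POST /schedule.json and return Scrapyd's JSON reply (contains a jobid)."""
    req = request.Request(f"{SCRAPYD}/schedule.json",
                          data=schedule_body(project, spider))
    with request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(schedule("bugcrowdscraper", "engagementspider"))
```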

Run

To schedule the spider via API:

curl http://localhost:6800/schedule.json -d "project=bugcrowdscraper" -d "spider=engagementspider"

Output

Items are exported according to the FEEDS setting in Scrapy or via spider run parameters.
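For example, a FEEDS entry in settings.py along these lines writes items as JSON Lines; the output path and options here are illustrative, not the project's shipped configuration:

```python
# Illustrative Scrapy FEEDS setting for settings.py.
FEEDS = {
    "output/engagements.jsonl": {
        "format": "jsonlines",
        "encoding": "utf8",
        "overwrite": True,
    },
}
```

When running locally, the same result can be achieved without editing settings via `scrapy crawl engagementspider -O output.jsonl`.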
