
diskover - File system crawler, disk space usage, storage search engine and analytics powered by Elasticsearch


diskover

diskover is an open source file system crawler and disk space usage tool that uses Elasticsearch to index and manage data across heterogeneous storage systems. With diskover, users can more effectively search and organize files, and system administrators can manage storage infrastructure, efficiently provision storage, monitor and report on storage use, and make informed decisions about new infrastructure purchases.

As the amount of file data generated by businesses continues to expand, so does the stress on expensive storage infrastructure, on users and system administrators, and on IT budgets.

Using diskover, users can identify old and unused files and gain better insight into data change, file duplication, and wasted space. diskover supports crawling local file systems or over NFS/SMB. Amazon S3 inventory files are also supported.

diskover is written and maintained by Chris Park (shirosai) and runs on Linux and OS X/macOS using Python 2/3.

This is the first tool I've found that can index 7m files/2m directories in under 20 min

-- linuxserver.io community member
diskover diagram

Screenshots

  • diskover crawler and worker bots running in a terminal
  • diskover-web (diskover's web file manager, analytics app, file system search engine, rest-api)
  • Kibana dashboards/saved searches/visualizations and support for Gource

diskover Gource videos


Sponsor diskover for as little as $1 / month.

If you are a fan of the project, or diskover is helping you save storage space and you want to see it keep advancing, please consider sponsoring the project on Patreon or PayPal. Thank you so much to all the users and supporters!

Installation Guide

Requirements

  • Linux or OS X/macOS (tested on OS X 10.11.6, Ubuntu 16.04/18.04)
  • Python 2.7.x or Python 3.5.x/3.6.x (tested on Python 2.7.15, 3.6.5); Python 3 recommended
  • Python elasticsearch client module
  • Python requests module
  • Python scandir module
  • Python progressbar2 module
  • Python redis module
  • Python rq module
  • Elasticsearch 5 (local or AWS ES Service, tested on Elasticsearch 5.6.9); Elasticsearch 6 is not supported yet
  • Redis (tested on 4.0.8)

Optional Installs

  • diskover-web (diskover's web file manager and analytics app)
  • Redis RQ Dashboard (for monitoring redis queue)
  • netdata (for realtime monitoring cpu/disk/mem/network/elasticsearch/redis/etc metrics, plugin for rq-dashboard in netdata directory)
  • Kibana (for visualizing Elasticsearch data, tested on Kibana 5.6.9)
  • X-Pack (Kibana plugin for graphs, reports, monitoring and http auth)
  • crontab-ui (web ui for managing cron jobs - for scheduling crawls)
  • cronkeep (alternative web ui for managing cron jobs)
  • Gource (for Gource visualizations of diskover Elasticsearch data, see videos above)
  • sharesniffer (for scanning your network for file shares and auto-mounting for crawls)
  • Python qumulo-api module (for using the Qumulo storage api, --qumulo cli option, install with pip, Python 2.7 only)

Download

$ git clone https://github.com/shirosaidev/diskover.git
$ cd diskover

Download latest version

Getting Started

Install Python dependencies using pip.

$ pip install -r requirements.txt

Copy the diskover config file diskover.cfg.sample to diskover.cfg and edit it for your environment.
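
Before kicking off any crawls, it can help to verify that the Elasticsearch and Redis endpoints you put in diskover.cfg are actually reachable. Below is a minimal sketch using the required elasticsearch and redis Python modules; the hostnames and ports are placeholders, not values taken from the sample config.

# Quick connectivity check for the services diskover depends on.
# The host/port values are assumptions -- use the ones from your diskover.cfg.
from elasticsearch import Elasticsearch
from redis import Redis

es = Elasticsearch(['localhost:9200'])
print('Elasticsearch reachable:', es.ping())   # True if the cluster responds

r = Redis(host='localhost', port=6379)
print('Redis reachable:', r.ping())            # True if Redis responds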

Start diskover worker bots (a good number might be cores x 2) with:

$ cd /path/with/diskover
$ python diskover_worker_bot.py

Worker bots can be added during a crawl to help work through the queue. To run a worker bot in burst mode (quit after all jobs are done), use the -b flag. Bots in burst mode die once the queue is empty, so use rq info or rq-dashboard to check whether they are still running.
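
The same queue the bots work can also be inspected from Python using the rq and redis modules. This is a rough sketch, not part of diskover itself; the Redis connection details and the queue name ('diskover_crawl') are assumptions, so check rq info for the actual queue name in your setup.

# Inspect the Redis queue and running worker bots from Python.
# Connection details and queue name are assumptions -- verify with rq info.
from redis import Redis
from rq import Queue, Worker

conn = Redis(host='localhost', port=6379)

q = Queue('diskover_crawl', connection=conn)
print('jobs waiting in queue:', len(q))         # pending crawl jobs

workers = Worker.all(connection=conn)
print('worker bots registered:', len(workers))  # bots currently connected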

To start up multiple bots, run:

$ cd /path/with/diskover
$ sh diskover-bot-launcher.sh

By default, this will start up 8 bots. See -h for cli options, including changing the number of bots to start. Bots can run on the same host as the diskover.py crawler or on multiple hosts across the network, as long as they mount the same nfs/cifs path as the rootdir (-d path) and can connect to ES and Redis (see the wiki for more info).

Usage examples

Start the diskover main job dispatcher and file tree crawler (using the adaptive batch size and optimize index cli flags) with:

$ python /path/to/diskover.py -d /rootpath/you/want/to/crawl -i diskover-indexname -a -O

With no flags, the defaults are to index from . (current directory), files >0 bytes, and 0 days modified time. Empty files and directories are skipped (unless you use the -s 0 and -e flags). Max crawl depth is 100, and max depth for directory size calculations is 100 by default. Use -h to see cli options.

Crawl down to maximum tree depth of 3 and only calculate dir size/items to level 3:

$ python diskover.py -i diskover-indexname -a -d /rootpath/to/crawl -M 3 -c 3

Only index files with modified time >90 days and file size >1 KB:

$ python diskover.py -i diskover-indexname -a -d /rootpath/to/crawl -m 90 -s 1024
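
Once a crawl finishes, the same criteria can be checked directly against the index with the Python elasticsearch client. This is a hedged sketch, not diskover code: the index name is the one used above, and the field names ('filesize', 'last_modified') and doc type ('file') are assumptions, so confirm them against your index mapping (for example in Kibana) before relying on the query.

# Find indexed files >1 KB that have not been modified in the last 90 days.
# Field names, doc type and index name are assumptions -- check your mapping.
from elasticsearch import Elasticsearch

es = Elasticsearch(['localhost:9200'])

query = {
    "query": {
        "bool": {
            "filter": [
                {"range": {"filesize": {"gt": 1024}}},            # > 1 KB
                {"range": {"last_modified": {"lte": "now-90d"}}}  # 90+ days old
            ]
        }
    }
}

res = es.search(index='diskover-indexname', doc_type='file', body=query, size=10)
print('matching files:', res['hits']['total'])
for hit in res['hits']['hits']:
    print(hit['_source'].get('filename'))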

Import Amazon S3 Inventory file(s) (gzipped csv) with:

$ python /path/to/diskover.py -i diskover_s3-indexname --s3 /dir/inventoryfile1.csv.gz

OVA image file (for VMware, etc.)

Becoming a Patron gets you access to OVA files running the latest version of diskover/diskover-web, which is the fastest way to get diskover up and running. Check out the Patreon page to learn how to get access to the OVA downloads.

Docker

You can set up diskover and diskover-web in Docker; there are a few choices for easily running diskover in Docker using pre-built images and compose files.

diskover-web has a Dockerfile with instructions for docker-compose.

linuxserver.io Docker Hub images:

Coming soon. To help with testing, head over to the linuxserver.io Discord and direct message @hackerman.

User Guide

Read the wiki for more documentation on how to use diskover.

Discussions/Questions

For discussions or questions about diskover, please ask on the Google Group.

Bugs

To report bugs in diskover, please use the issues page.

License

See the license file.
