- Clone this repository to an appropriate location (e.g. `~/Dev`).
- You will need a working Python 3 environment with `requests` installed (`pip3 install requests`).
- The `beats-keystore` executable needs to be available on your `$PATH`. For development usage, the script will first check the project's `bin` directory.
  - (Recommended) Download the precompiled binary for your OS architecture and place it in `~/bin`.
  - (Compile with Go)
    - This assumes that you have a working Go environment.
    - You may need to `go get` the necessary dependencies (e.g. `go get -u golang.org/x/crypto/pbkdf2`).
    - Build the binary on your `$PATH`: `cd ~/bin; go build /path/to/project/bin/beats-keystore.go`
    - For development, build in the project's `bin` directory: `cd /path/to/project/bin; go build beats-keystore.go`
- The script will prompt for Elastic Cloud credentials and create a cluster called `support-ece-diagnostic` in a region of your choice. It will reuse this cluster unless it has been deleted.
- Change directory to the root folder of the ECE diagnostic, then run `python3 ~/Dev/ece-diag-processor/ece-diag-processor.py`.
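The lookup order for the `beats-keystore` binary (project `bin` directory first, then `$PATH`) could be sketched roughly like this. This is a hypothetical helper for illustration, not the script's actual code:

```python
import os
import shutil

def find_beats_keystore(project_root):
    """Prefer the project's bin/ copy (dev usage), then fall back to $PATH."""
    dev_path = os.path.join(project_root, "bin", "beats-keystore")
    if os.path.isfile(dev_path) and os.access(dev_path, os.X_OK):
        return dev_path
    # shutil.which returns None when the binary is not on $PATH
    return shutil.which("beats-keystore")
```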
This is a work in progress. Tested with filebeat-6.5.4 and a 6.6.0 Elasticsearch cluster on Elastic Cloud.
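The prompt-for-credentials, reuse-or-create flow described in the setup steps might look something like the sketch below. The API base URL, endpoint paths, response shape, and payload are assumptions for illustration only; consult the Elastic Cloud API documentation rather than treating this as the script's implementation:

```python
import getpass

API_BASE = "https://api.elastic-cloud.com/api/v1"  # assumed base URL
CLUSTER_NAME = "support-ece-diagnostic"

def find_existing(clusters, name=CLUSTER_NAME):
    """Return the first cluster dict whose name matches, or None."""
    return next((c for c in clusters if c.get("cluster_name") == name), None)

def get_or_create_cluster(region):
    """Reuse the support cluster if it exists, otherwise create it."""
    import requests  # third-party; installed via `pip3 install requests`

    auth = (input("Elastic Cloud user: "), getpass.getpass("Password: "))
    # Endpoint and response field names below are assumptions.
    resp = requests.get(f"{API_BASE}/clusters/elasticsearch", auth=auth)
    resp.raise_for_status()
    existing = find_existing(resp.json().get("elasticsearch_clusters", []))
    if existing:
        return existing
    payload = {"cluster_name": CLUSTER_NAME, "region": region}  # assumed shape
    resp = requests.post(f"{API_BASE}/clusters/elasticsearch",
                         json=payload, auth=auth)
    resp.raise_for_status()
    return resp.json()
```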
- Rename `metadata` -> `@metadata`. This would display more nicely in Kibana Discover (due to the key sorting).
- `metadata.file` does not need to include the diagnostic name; this has already been extracted to `metadata.diag_name`.
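The rename could be done with an Elasticsearch ingest pipeline `rename` processor. A sketch of the pipeline body, expressed here as a Python dict; the pipeline id and how it gets wired in are assumptions:

```python
# Ingest pipeline body implementing the proposed rename. The `rename`
# processor moves the field (including its sub-fields) to the new name.
RENAME_PIPELINE = {
    "description": "Rename metadata -> @metadata for nicer Kibana Discover sorting",
    "processors": [
        {"rename": {"field": "metadata", "target_field": "@metadata"}}
    ],
}

# Could be registered with e.g. PUT /_ingest/pipeline/ece-diag-rename
# (the pipeline id "ece-diag-rename" is a hypothetical choice).
```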
- Proxy Logs Mapping (done, add lowercasing and path analyzers)
- Elasticsearch Logs Mapping
- Services Logs Mapping
- Verify Timezones are properly handled for logs
- Missing logs and data:
  - Kibana is missing the boot log.
  - Need to simplify picking up all files. Maybe some conditionals in the ingest pipeline, and unknown files could be dumped to a catch-all index.
  - Some data may require preprocessing (system commands / Docker container logs / inspect JSON).
  - Older versions of the diagnostic did not have the same folder structure. Need to evaluate whether this could be corrected easily. Also need to handle JSON logging formats.
- I'm considering a custom Beat, which would allow for the preprocessing and logic. It would also allow prompting for credentials and automatically creating (and configuring) the destination cluster in Elastic Cloud. The idea would be to print the relevant information to the console and provide a console-based progress bar. The progress data could potentially be written to Elasticsearch as well, but I need to evaluate how to conditionally display the upload status in Kibana.
- Dashboards and Visualizations
- Find common issues in the data (run Watcher?)
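For the catch-all idea in the TODO list above, one option is a conditional `set` processor that rewrites `_index` for documents whose source file was not recognised. The condition, field name, and index name here are all hypothetical:

```python
# Conditional ingest processor: documents that did not match any known
# file type get rerouted to a catch-all index by overwriting _index.
# The `if` condition and the metadata.file_type field are assumptions.
CATCHALL_PROCESSOR = {
    "set": {
        "if": "ctx.metadata?.file_type == null",
        "field": "_index",
        "value": "ece-diag-catchall",
    }
}
```

Note that per-processor `if` conditions require Elasticsearch 6.5 or later, which matches the 6.6.0 cluster this was tested with.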