This script will not configure the collections for you. You have to read the manual for each collection and configure it first.
fcpc-index-data.sh is distributed with the platform. It is a helper script that runs multiple collections, creates epochs, and runs all required jobs.
Configuration lives in the user's home directory and is environment based. By default, the script reads ~/.profile.
The following help comes directly from the script:
FCPC_BIN=${FCPC_BIN:-fcpc} #fcpc binary to use
FCPC_OPTIONS=${FCPC_OPTIONS:-""} #Manual options to add
FCP_TMP=${FCP_TMP:-~/.fcpc/data} #Directory used to download data before indexing
FCP_RM_TMP=${FCP_RM_TMP:-yes} #Automatically clean up TMP
FCP_COLLECTIONS=${FCP_COLLECTIONS:-""} #List of collections to run. You can append number of jobs per collection by adding number in brackets. Like nmap[3]
FCP_PARALLEL=${FCP_PARALLEL:-"0"} #How many parallel collections to run
FCP_JOBS=${FCP_JOBS:-"all,-ruler,-dummy,-linktagger"} #List of jobs to execute
FCP_AUTO_UPDATE_DB=${FCP_AUTO_UPDATE_DB:-"no"} #If set to yes, DB will be updated automatically
FCP_AUTO_EPOCH=${FCP_AUTO_EPOCH:-daily} #Can be set to no (no epoch creation), daily (YYYY_MM_DD), weekly (YYYY_MM_DD) or monthly (YYYY_MM)
FCP_COMMIT_EPOCH=${FCP_COMMIT_EPOCH:-yes} #If set to yes, epoch will be committed at the end
FCP_VIEWS=${FCP_VIEWS:-""} #Set to list of views to execute
FCP_RABBITMQ=${FCP_RABBITMQ:-""} #If set to yes, use RabbitMQ
FCP_SEPARATE_CONSUMERS=${FCP_SEPARATE_CONSUMERS:-""} #If set to a number, run that many separate consumer processes
FCP_RUN_COLLECTIONS=${FCP_RUN_COLLECTIONS:-yes} #If set to no, do not run collections
FCP_RUN_JOBS=${FCP_RUN_JOBS:-yes} #If set to no, do not run jobs
FCP_RUN_VIEWS=${FCP_RUN_VIEWS:-yes} #If set to no, do not run views
FCP_RUN_REPORTS=${FCP_RUN_REPORTS:-yes} #If set to no, do not run reports
FCP_RUN_EXPORT=${FCP_RUN_EXPORT:-""} #If set to name of the file, database will be exported to this file after commit
FCP_DRY=${FCP_DRY:-""} #If set to yes, script will just show what to do
INDEX_LOG=${FCP_LOG:-"fcpc-index-data-$$.log"} #Where to save the log
# Functions used
# FCP_onerror() If set, it is called when any error occurs
# FCP_pre_collections() If set, it is run before collections
# FCP_post_collections() If set, it is run after collections
# FCP_post_jobs() If set, it is run after jobs
# FCP_post_views() If set, it is run after views
# FCP_pre_report() If set, it is run before report
# FCP_report() If set, it is run for reporting before commit
# FCP_pre_commit() If set, it is called just before commit
# FCP_post_commit() If set, it is called just after commit (Epoch is read-only during this)
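As a sketch, an error hook placed in ~/.profile alongside the other settings might look like this; the syslog tag and message are only illustrative, and the example assumes the script calls FCP_onerror on any failure, as documented above:

```shell
# Hypothetical error hook; fcpc-index-data.sh calls FCP_onerror when an error occurs.
FCP_onerror() {
    # Leave a trace in syslog so unattended (cron) runs are easy to diagnose.
    # INDEX_LOG is the log file configured above; the tag "fcpc-index" is arbitrary.
    logger -t fcpc-index "indexing failed, see ${INDEX_LOG}"
}
```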
You can use the following example. It automatically updates the DB if there is a new version of the platform,
runs the listed collections, and runs all required jobs. It also defines a function that is run before collections.
export FCP_AUTO_UPDATE_DB=yes
export FCP_COLLECTIONS="qualys azuread skybox zabbix itop ansible shodan openvas dns"
FCP_pre_collections() {
echo "Do something useful before collections"
}
By default, the script logs to stdout and stderr. If you want to run it from cron, redirect both to a file (note that stdout must be redirected before stderr):
fcpc-index-data.sh >~/fcpc-index.log 2>&1
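For example, a nightly crontab entry could look like the sketch below; the 02:30 schedule and the script/log paths are arbitrary and should be adapted to your installation:

```shell
# m  h  dom mon dow  command
30   2  *   *   *    $HOME/bin/fcpc-index-data.sh >$HOME/fcpc-index.log 2>&1
```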
Logs are also saved into the current epoch using fcpc history index-log.
You can use this command to feed any data from stdin into the epoch; the script does this automatically:
fcpc history index-log
The stored log can be read back with:
fcpc history cat-log
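For instance, the output of an external tool can be stored in the epoch log like this; the nmap invocation and target range are only an illustration:

```shell
# Capture both stdout and stderr of an external task and feed it into the epoch log.
nmap -sV 192.0.2.0/24 2>&1 | fcpc history index-log
```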
You can use the epoch status command to see the current status of the epoch, its collections, and its jobs.
fcpc epoch status
# Epoch Epoch/d2023_04_18 status:
- tag:
- created: 2023-04-18 16:41:31
- objects: 10043
- links: 29706
- new: 0
- modified: 0
- deleted: 0
## Collection status:
### collection/qualys:download-and-import-all
- start: 2023-04-18 18:42:36
- end: 2023-04-18 18:56:54
- duration: 0:14:18
- modified: 0
- created: 0
- deleted: 0
### collection/azuread:download-and-import-all
- start: 2023-04-18 18:57:06
- Did not finish successfully yet
## Job status:
- No job started or finished
## Views status:
- No views started or finished
## Object stats:
object doc_count
------------------- -----------
qualyssoftware 4320
qualysl4interface 2566
qualyscve 1426
qualysvulnerability 838
qualyspatch 378
qualysl3interface 222
qualysip 160
history 77
qualyshost 38
qualysaccount 18
## Links stats:
object doc_count
----------- -----------
link/is-in 28193
link/refers 1513
The script records all of this automatically for the common steps described above.
You can also save information about external tasks into the epoch, so you have evidence of whether a task started, whether it finished, and how long it took.
External job started:
fcpc job external-start mycomplexjob
External job ended:
fcpc job external-end mycomplexjob
External job status:
fcpc job external-status mycomplexjob
[6290/2023-04-18 19:16:14,536] fcp:ERROR [cli.py:770]
FCPC version 2.3.8.4f6da1fc, built 2023-03-06: Exiting with code 6: External job mycomplexjob was not found within current epoch history
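Putting the three commands together, a long-running external task can be wrapped like this; mybackup.sh is a placeholder for your own task:

```shell
# Record the start, run the task, then record the end so the epoch keeps timing evidence.
fcpc job external-start mycomplexjob
./mybackup.sh 2>&1 | fcpc history index-log  # optionally keep the task output in the epoch log
fcpc job external-end mycomplexjob
```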
You can have reports generated automatically by defining the following:
export FCP_RUN_REPORTS=yes
FCP_report() {
fcpc report generate-from-template template.html.jinja >report.html
}
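Before wiring everything into cron, you can preview what the script would do by using the FCP_DRY option documented above:

```shell
# Dry run: the script only shows what it would do, without executing anything.
FCP_DRY=yes fcpc-index-data.sh
```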