
Installation of version 2.7.1

0) PREREQUISITES: this step requires an internet connection to install all required packages/dependencies.

CODE
## ILP WEBAPPS (where we want to install ILP):
sudo yum -y install epel-release
sudo yum -y groupinstall "Development Tools"
sudo yum install -y java-11-openjdk java-11-openjdk-devel libffi-devel bzip2-devel xz-devel openssl-devel bc pcre-devel systemd-devel chrony logrotate
sudo pip3 install kafka-python
## PERSISTENCE (where we installed persistence layer of veridiumid server):
# on CentOS 7
sudo yum install bc -y
sudo pip3 install pyyaml jinja2 kafka-python
# on Rocky 9
sudo yum install bc python3-pyyaml python3-jinja2 -y
sudo pip3 install kafka-python
## on the node where we start the installation (one of the uba-webapps):
sudo yum -y install python3-pip
sudo yum -y install python3-pyyaml python3-jinja2
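
To quickly confirm that the Python modules are importable, a minimal check (run it on the node where the installation is started; the webapps and persistence nodes only need the subset listed above):

CODE
# fails with an ImportError if a prerequisite module is missing
python3 -c "import yaml, jinja2; print('pyyaml and jinja2 OK')"
python3 -c "import kafka; print('kafka-python OK')"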

1) Download the ILP installer on the machine where the installation will be started (this also requires an internet connection, to reach the VeridiumID Nexus repository).

Please check that you have enough space (df -h). The zip file is 5.8 GB, and 7.9 GB uncompressed.

CODE
wget --user nexusUser --password nexusPassword https://veridium-repo.veridium-dev.com/repository/UBAInstallerOnPrem/2.7.1/uba-onprem-installer.zip
unzip uba-onprem-installer.zip
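
For example, to confirm there is enough free space before and after extraction (run it in the download directory):

CODE
df -h .   # roughly 14 GB free is needed for the zip plus the extracted files
unzip -l uba-onprem-installer.zip | tail -n 2   # the summary line shows the total uncompressed size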

2) Configure the domain certificate.

Connect to a webapp veridiumid-server node and copy the file /etc/veridiumid/haproxy/server.pem to the following location on the machine where the installation will be started:

CODE
vi uba-onprem-installer/webapp/haproxy/server.pem
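
A minimal sketch of the same copy using scp (the hostname webapp-node1 and the SSH user are placeholders for your environment):

CODE
# run on the machine where the installation is started
scp user@webapp-node1:/etc/veridiumid/haproxy/server.pem uba-onprem-installer/webapp/haproxy/server.pem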

3) Generate an SSH key to do the installation:

CODE
## On the server where the installation is started, generate an SSH key and copy it to all servers
# (for the installation user, and for the veridiumid user on persistence nodes only):
ssh-keygen
cat .ssh/id_rsa.pub
vi .ssh/authorized_keys
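
Instead of pasting the public key into authorized_keys by hand, ssh-copy-id can distribute it; a sketch assuming IP1..IP5 stand for your webapp and persistence nodes and installuser is the SSH_USER from the next step:

CODE
# copy the public key to every server for the installation user
for host in IP1 IP2 IP3 IP4 IP5; do
    ssh-copy-id "installuser@${host}"
done
# the persistence nodes (IP3-IP5) also need the key for the veridiumid user
for host in IP3 IP4 IP5; do
    ssh-copy-id "veridiumid@${host}"
done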

4) Configure the variables file (modify only the following values):

CODE
vi uba-onprem-installer/variables.yaml
SSH_USER: the user for which you have generated the ssh key
WEBAPP_CONTACT_POINTS: IP1,IP2
PERSISTENCE_CONTACT_POINTS: IP3,IP4,IP5
ZOOKEEPER_SERVERS: IP3,IP4,IP5
# if the certificate is for the domain *.dev17.veridium-dev.com, this should be the format:
CLUSTERSUFFIX: dev17.veridium-dev.com
DOMAINSEPARATOR: "."
# Kafka Threshold Alert when uba_check_services is running
KAFKA_THRESHOLD_ALERT: 5
# period at which uba-cannary runs, in seconds
CANARY_RUN_PERIOD: "60"
# maximum duration of a uba-cannary call, in seconds
MAX_DURATION_CALL_SECONDS: "4"
# Example: uba-europe; the cluster ID, included in the uba-cannary email alerts
UBA_CLUSTERID: "uba-europe"
# Example: europe-1; the cluster region, included in the uba-cannary email alerts
UBA_REGION: "europe-1"
# Example: myserver.domain.com; the SMTP server used by uba-cannary to send email alerts
UBA_CANNARY_MAILSMTPHOST: "myserver.domain.com"
# Example: 587; the SMTP AUTH port used by uba-cannary to send email alerts
UBA_CANNARY_MAILSMTPPORT: "587"
# Example: uba@domain.ro; the SMTP user used by uba-cannary to send email alerts
UBA_CANNARY_MAILSMTPAUTHUSER: "uba@domain.ro"
# Example: mystrongpassword; the SMTP password used by uba-cannary to send email alerts
UBA_CANNARY_MAILSMTPAUTHPWD: "mystrongpassword"
# Example: uba@domain.com; the From address for the uba-cannary alert emails
UBA_CANNARY_MAILFROM: "uba@domain.com"
# Example: noc@domain.com; the address the uba-cannary alert emails are sent to
UBA_CANNARY_MAILTO: "noc@domain.com"
TIMEZONE: "Europe/Berlin"
UBA_VERSION: "2.7.1"
# Login to any existing veridiumid-server node and get the details from: /etc/veridiumid/zookeeper.properties
# ZOOKEEPER_USERNAME is username
# ZOOKEEPER_PASSWORD is password
# ZOOKEEPER_ENCRYPT_SALT is encrypt.salt
ZOOKEEPER_USERNAME: "veridiumid"
ZOOKEEPER_PASSWORD: "OUFYKCYBeXLSGi"
ZOOKEEPER_ENCRYPT_SALT: "zHhYMrnOENOorZ"
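
To pull those three values from an existing node, for example (assuming the usual key=value layout of the properties file):

CODE
# run on any existing veridiumid-server node
grep -E 'username|password|encrypt.salt' /etc/veridiumid/zookeeper.properties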

5) Start the installation process:

CODE
cd ./uba-onprem-installer
# check if the prerequisites are installed
./check_prereqs_rhel9.sh -r -w WEBAPP_IPS -p PERSISTENCE_IPS
# example: ./check_prereqs_rhel9.sh -r -w 10.56.129.11,10.56.254.7 -p 10.56.242.152,10.56.234.212,10.56.135.193
# start the installation process
./uba-installer-rhel9.sh


6) Generate a tenant for veridiumid-server, with a random UUID (ONE TIME).

The command below initialises the tenantId 79257e79-ae13-4d3d-9be3-5970894ba386; you can use another UUID and replace it in the command if you want:

CODE
# generate another random uuid if you want or use 79257e79-ae13-4d3d-9be3-5970894ba386
uuidgen # it will output a UUIDv4 like 88eb726f-4bee-4fe5-836c-b9b634c34e4c to be used as tenantId
# use the tenantId as a parameter for the following script (in case of non-CDCR deployments)
cd ./uba-onprem-installer
./generate_tenant_platform.sh TENANT_ID 
# use the tenantId as a parameter for the following script (in case of CDCR deployments)
cd ./uba-onprem-installer
./generate_tenant_platform_cdcr.sh TENANT_ID

To test whether the initialisation was successful, go to a persistence node and, in cqlsh, check that the following tables contain data:

CODE
use uba;
select * from tenants;
# should contain one entry, the tenant we registered
select * from global_model_latest_with_tenant;
# should contain one entry, the global context model
select count(1) from features_ordered_by_time;
# should contain 900+ entries; wait until the count doesn't change, then start doing authentications
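
Rather than re-running the count by hand, a small loop can poll it until it stabilizes (a sketch to run on a persistence node; cqlsh connection flags and credentials depend on your cluster setup):

CODE
# re-check the row count every 30 seconds; stop when two consecutive counts match
prev=-1
while true; do
    cur=$(cqlsh -e "select count(1) from uba.features_ordered_by_time;" | awk '/^ *[0-9]+ *$/ {print $1}')
    [ "$cur" = "$prev" ] && break
    prev="$cur"
    sleep 30
done
echo "count stabilized at $cur"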

7) Generate a tenant for the UBA canary service (ONE TIME - optional):

CODE
# remove /tmp/cli.log as root (or a user with sudo permissions) and restart the uba-cli-server service:
sudo rm -rf /tmp/cli.log
sudo systemctl restart uba-cli-server
# As user "veridiumid" run the following commands:
cd ./uba-onprem-installer
./generate_tenant_cannary.sh

To test whether the initialisation was successful, go to a persistence node and, in cqlsh, check that the following tables contain data:

CODE
use uba;
select * from tenants;
# should contain two entries, the tenants we registered
select * from global_model_latest_with_tenant;
# should contain two entries, the global context model and uba-cannary
select count(*) from features_ordered_by_time where tenant_id=8a2b6534-4506-4120-94af-329370b02f68 allow filtering;
# should contain 1800+ entries; wait until the count doesn't change, then start doing authentications

Note: uba-cannary needs to run only on one server:

CODE
sudo systemctl start uba-cannary
sudo systemctl status uba-cannary
# check the log file: /var/log/veridiumid/uba/uba-cannary.log

The total duration depends on the "CANARY_RUN_PERIOD" variable set in "/etc/default/veridiumid/uba_variables"; if the value is 60, it will take 60 seconds x 11 authentications = 11 minutes.
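
To confirm the value actually in effect on the host:

CODE
grep CANARY_RUN_PERIOD /etc/default/veridiumid/uba_variables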

After the run completes, we will see the scores in the log:

CODE
...truncate...
INFO  com.veridiumid.uba.v2.cannary.MotionContextClientService:350 - Context result answer: ACCEPT confidence HIGH score 0.25694302675560166
...truncate...
INFO  com.veridiumid.uba.v2.cannary.MotionContextClientService:291 - UBA score: 0.7028731166294203 for stage ID 4f374442-9e8d-4687-94fb-c8918a24c2a5
...truncate...

 

8) Configure the integration of veridiumid-server with the ILP cluster:

You need to configure the following entries in the main load balancer so that it balances traffic to the two ILP webapp machines. Example configuration for an HAProxy balancer:

CODE
frontend uba_webapp_443
    bind *:443
    mode tcp
    tcp-request inspect-delay 5s
    tcp-request content accept if { req_ssl_hello_type 1 }
    use_backend backend_uba
backend backend_uba
    mode tcp
    balance leastconn
    stick match src
    stick-table type ip size 1m expire 1h
    option ssl-hello-chk
    option tcp-check
    tcp-check connect port 443
    server webappserver1 10.56.241.64:443 check id 1
    server webappserver2 10.56.207.206:443 check id 2

Where 10.56.241.64 is the IP of ILP machine1 and 10.56.207.206 is the IP of ILP machine2.
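
Before reloading, the configuration can be validated in place (a sketch assuming the default HAProxy config path on the balancer):

CODE
# syntax-check the configuration, then reload without dropping existing connections
sudo haproxy -c -f /etc/haproxy/haproxy.cfg
sudo systemctl reload haproxy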

On the webapp machines of veridiumid-server (on each machine), we need to add the following lines to the /etc/hosts file:

CODE
## edit /etc/hosts
10.70.0.50 tenant.ilpdevelop.veridium-dev.com
10.70.0.50 users.ilpdevelop.veridium-dev.com
10.70.0.50 ingestion.ilpdevelop.veridium-dev.com
10.70.0.50 opentelemetry-collector.ilpdevelop.veridium-dev.com

Where 10.70.0.50 is the IP of the load-balancer from the previous step.
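
To verify that the entries resolve and that the balancer answers on port 443, for example (any HTTP status code proves reachability; -k skips certificate validation):

CODE
for host in tenant users ingestion opentelemetry-collector; do
    getent hosts "${host}.ilpdevelop.veridium-dev.com"
done
curl -sk -o /dev/null -w '%{http_code}\n' https://tenant.ilpdevelop.veridium-dev.com/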

On the webapp machines of veridiumid-server (on each machine), we need to add the following line to the /opt/veridiumid/tomcat/bin/setenv.sh file:

CODE
### go in setenv in tomcat and add the following
TRACING_AGGREGATE_SPAN="true"

After that, you need to restart Tomcat on both machines:

CODE
sudo service ver_tomcat restart
# wait for the servers to restart successfully
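
A quick way to confirm the restart completed (the catalina.out path assumes the standard Tomcat layout under /opt/veridiumid/tomcat):

CODE
sudo service ver_tomcat status
# or watch the Tomcat log until startup completes
sudo tail -f /opt/veridiumid/tomcat/logs/catalina.out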

 

9) Integrate ILP with the VeridiumID application

9.1) Log in to WebSecAdmin and go to Settings → UBA Settings:

  • Enabled: (ON)

  • UBA CLUSTER SUFFIX: in our case, the "CLUSTERSUFFIX" value from variables.yaml

  • UBA Subdomain Separator: in our case, the "DOMAINSEPARATOR" value from variables.yaml

  • Tenant Id*: in our case, the tenant id "79257e79-ae13-4d3d-9be3-5970894ba386", or the UUID you generated in step 6.

  • Use Compact Inference: (ON)

Save the configuration.

 

9.2) Go to Settings → Advanced → mobileSettings.json

  • Click on Format document (button on the right)

  • Set the value of is-uba-enabled to "true"

9.3) Check that uba_command_motion and uba_command_context are enabled in the journey you are using.

9.4) Configure the proxy (if you are using one - OPTIONAL) to keep the traffic internal (where ilpdevelop.veridium-dev.com is the domain you are using for ILP):

CODE
## in WebSecAdmin, add an entry like the following to proxy-config.json to keep the traffic internal
"nonProxyHttpsHosts": "localhost|ilpdevelop.veridium-dev.com|api.twilio.com|*.ilpdevelop.veridium-dev.com"

9.5) Go to the SSP login page and do 11 logins; you will see the Motion / Context scores in Activity. After 4 authentications you should receive a context score, and after 11 authentications you should receive a motion score as well.

 

Troubleshooting and script commands:

CODE
# run the following command to see if everything is running (all services except uba-cannary):
uba_check_services
# ILP WebAPP
# to see service status on webapp
uba_check_services
uba_check_kafka
# to stop/start services
uba_stop
uba_start
# VeridiumID Persistence
# to see status on persistence, for ILP services:
uba_check_services
uba_check_kafka
# to stop/start UBA-specific services (just Kafka)
systemctl stop uba-kafka
systemctl start uba-kafka
# to check logs of the platform:
less/vim/cat/tail /var/log/veridiumid/uba/log_file_of_service.log # where log_file_of_service is the file you want to see.