Upgrade VeridiumID from 3.7.x / 3.8.x to v3.8.4
🔧 Pre-Upgrade Preparation
Backup & Snapshot
Take VM snapshots or backups of all nodes before starting.
Identify node roles:
WEBAPP node: the node where the web services are running.
PERSISTENCE node: the node where Cassandra is running.
Based on the OS version, please download the necessary packages.
cat /etc/redhat-release
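The output tells you which package set to download; for example (illustrative output, your minor release may differ):
## Red Hat Enterprise Linux release 8.10 (Ootpa)  -> use the RHEL8 packages
## Red Hat Enterprise Linux release 9.4 (Plow)    -> use the RHEL9 packages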
Packages
| Package URL | MD5 | SHA1 | Description |
|---|---|---|---|
| https://veridium-repo.veridium-dev.com/repository/VeridiumUtils/Veridium-3.8.4-update/veridiumid-update-packages-rhel8-12.4.72.zip | | | VeridiumID Update packages archive containing all RPMs, for local update procedure RHEL8 |
| https://veridium-repo.veridium-dev.com/repository/VeridiumUtils/Veridium-3.8.4-update/veridiumid-update-packages-rhel9-12.4.72.zip | | | VeridiumID Update packages archive containing all RPMs, for local update procedure RHEL9 |
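To verify the integrity of a downloaded archive against the checksums published for it, a minimal check using the standard md5sum/sha1sum tools (file name as downloaded in the steps below):
md5sum veridiumid-update-packages-rhel8-12.4.72.zip
sha1sum veridiumid-update-packages-rhel8-12.4.72.zip
## compare the printed values with the MD5 / SHA1 columns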
Upgrade Methods
You can use either:
A. Local Packages
Download and extract the package:
## RHEL8, Rocky8
export https_proxy=PROXY_IP:PROXY_PORT
wget --user NEXUS_USER --password NEXUS_PASSWORD \
  https://veridium-repo.veridium-dev.com/repository/VeridiumUtils/Veridium-3.8.4-update/veridiumid-update-packages-rhel8-12.4.72.zip
TMP_DEST="/home/veridiumid/update384"
unzip veridiumid-update-packages-rhel8-12.4.72.zip -d ${TMP_DEST}

## RHEL9, Rocky9
export https_proxy=PROXY_IP:PROXY_PORT
wget --user NEXUS_USER --password NEXUS_PASSWORD \
  https://veridium-repo.veridium-dev.com/repository/VeridiumUtils/Veridium-3.8.4-update/veridiumid-update-packages-rhel9-12.4.72.zip
TMP_DEST="/home/veridiumid/update384"
unzip veridiumid-update-packages-rhel9-12.4.72.zip -d ${TMP_DEST}

Run the update commands (on WEBAPP nodes first, then on PERSISTENCE nodes):
sudo yum localinstall -y ${TMP_DEST}/packages/veridiumid_update_procedure-12.4.72-20260219.x86_64.rpm
sudo python3 /etc/veridiumid/update-procedure/current/preUpdateSteps.py --version 12.4.72 --rpm-path ${TMP_DEST}/packages/
sudo python3 /etc/veridiumid/update-procedure/current/startUpdate.py --version 12.4.72 --rpm-path ${TMP_DEST}/packages/
sudo bash /etc/veridiumid/scripts/check_services.sh
B. YUM Repository
Ensure the repository has the package:
sudo yum list available veridiumid_update_procedure-12.4.72-20260219

Run the update (one node at a time):
sudo yum clean metadata
sudo yum install -y veridiumid_update_procedure-12.4.72
sudo python3 /etc/veridiumid/update-procedure/current/preUpdateSteps.py --version 12.4.72 --use-repo
sudo python3 /etc/veridiumid/update-procedure/current/startUpdate.py --version 12.4.72 --use-repo
sudo bash /etc/veridiumid/scripts/check_services.sh
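Whichever method is used, you can afterwards confirm which update-procedure package is installed (an optional quick check using standard rpm tooling; package name taken from the commands above):
rpm -q veridiumid_update_procedure
## expected to report release 12.4.72-20260219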
Post-Upgrade Steps
1. Data Migration to ELK
Run once on a Persistence node, only if updating from versions older than 3.8.1:
sudo bash /opt/veridiumid/migration/bin/migrate_to_elk.sh
2. Cassandra Upgrade (if on 4.0.9 or 4.1.4) → 5.0.2
Check version on a Persistence node:
/opt/veridiumid/cassandra/bin/nodetool describecluster | grep -A1 "Database versions"
## if the version is 4.0.9 or 4.1.4, the update should be executed; the target version is 5.0.2
Then upgrade all Persistence nodes (one node at a time):
If the update is done with local packages:
## check status - all nodes should be up - the status "UN" should be shown for every node
/opt/veridiumid/cassandra/bin/nodetool describecluster
/opt/veridiumid/cassandra/bin/nodetool status
TMP_DEST="/home/veridiumid/update384"
sudo bash /etc/veridiumid/update-procedure/current/resources/scripts/372/update_cassandra.sh ${TMP_DEST}/packages/
## check status - all nodes should be up again, in the cluster - the status "UN" should be shown for every node
sudo /opt/veridiumid/cassandra/bin/nodetool status
sudo /opt/veridiumid/cassandra/bin/nodetool describecluster
If the update is done with the YUM repository:
## check status - all nodes should be up - the status "UN" should be shown for every node
/opt/veridiumid/cassandra/bin/nodetool describecluster
/opt/veridiumid/cassandra/bin/nodetool status
## run on every node
/opt/veridiumid/cassandra/bin/nodetool describecluster
sudo bash /etc/veridiumid/update-procedure/current/resources/scripts/372/update_cassandra.sh
## check status - all nodes should be up again, in the cluster - the status "UN" should be shown for every node
sudo /opt/veridiumid/cassandra/bin/nodetool status
sudo /opt/veridiumid/cassandra/bin/nodetool describecluster
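For reference, a healthy cluster reports every node as UN (Up/Normal); nodetool status output looks roughly like this (illustrative only; addresses, load and host IDs will differ):
## Datacenter: dc1
## ==============
## --  Address      Load      Tokens  Owns  Host ID                               Rack
## UN  10.0.0.11    1.2 GiB   16      ?     1f0c...                               rack1
## UN  10.0.0.12    1.1 GiB   16      ?     9a4b...                               rack1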
3. Create one Zookeeper Cluster and enable Read Only mode (Optional)
In a single-DC implementation, run on the Persistence nodes, only if updating from versions 3.7.x:
sudo bash /etc/veridiumid/update-procedure/current/resources/scripts/372/update_zookeeper_configuration.sh
In case of CDCR, run the following procedure instead, to create one large cluster with nodes from both datacenters. The previous command should not be executed in a CDCR setup.
Before starting this configuration, make sure that you have connectivity on ports 2888 and 3888 between ALL persistence nodes.
To test the connectivity, run the following commands on DC1:
nc -zv IPNODEDC2 2888
nc -zv IPNODEDC2 3888
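If there are several persistence nodes in the second datacenter, a small loop can run the same checks against each of them (a sketch; IPNODEDC2_1, IPNODEDC2_2 are placeholders to replace with your actual persistence node IPs):
for ip in IPNODEDC2_1 IPNODEDC2_2; do
  nc -zv ${ip} 2888
  nc -zv ${ip} 3888
done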
## In the second datacenter, stop all webapp services and zookeeper services
ver_stop_webapp
service ver_zookeeper stop
## run this command on one webapp node in the primary datacenter. This generates a file DC1.tar.gz
sudo bash /etc/veridiumid/scripts/veridiumid_cdcr.sh -i
## copy the DC1.tar.gz to all nodes - webapp and persistence in both datacenters.
## run this command on all persistence nodes in both datacenters, starting with the first one -
## the script will create a large cluster containing the Zookeeper nodes from both datacenters and remove the data from the second DC
ARCH_PATH=/tmp/DC1.tar.gz
sudo bash /etc/veridiumid/scripts/veridiumid_cdcr.sh -c -e -a ${ARCH_PATH}
## run this command on all webapp nodes in second datacenter
ARCH_PATH=/tmp/DC1.tar.gz
sudo bash /etc/veridiumid/scripts/veridiumid_cdcr.sh -c -e -a ${ARCH_PATH}
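To confirm that each Zookeeper node has joined the combined ensemble, the four-letter-word admin command can be queried on every persistence node (a sketch; assumes the srvr command is whitelisted and Zookeeper listens on port 2181):
echo srvr | nc localhost 2181 | grep Mode
## one node should report Mode: leader, the others Mode: follower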
4. ELK Stack Upgrade to 8.17.3 (Required)
Upgrade Elasticsearch, Kibana, and Filebeat (first on persistence nodes, then on webapp nodes), if they have not already been updated.
Check version on Persistence nodes:
## check if the version is already 8.17.3; if not, run the update procedure; if it is 8.17.3, do nothing
sudo /opt/veridiumid/elasticsearch/bin/elasticsearch --version
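Illustrative output when the node is already on the target version (build details will differ):
## Version: 8.17.3, Build: ...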
If the update is done using local packages:
Run on all nodes, first persistence and then webapp nodes:
## run the command below on all nodes, first on persistence and then on webapp, one by one
TMP_DEST="/home/veridiumid/update384"
sudo bash /etc/veridiumid/update-procedure/current/resources/scripts/380/update_elk_stack.sh ${TMP_DEST}/packages/
eops -l
## it is ok if the cluster is in red status
After all persistence nodes are updated to version 8.17.3, run the following command on all nodes (persistence + webapp), starting with the persistence nodes:
sudo bash /etc/veridiumid/update-procedure/current/resources/scripts/380/update_elk_stack.sh post
## check that the Elasticsearch cluster status is green
check_services
Run only on one webapp node:
## run on one node only:
/opt/veridiumid/migration/bin/elk_ops.sh --update-settings
If the update is done using the YUM repository:
Run on all nodes, first persistence and then webapp nodes:
sudo bash /etc/veridiumid/update-procedure/current/resources/scripts/380/update_elk_stack.sh
## here the elasticsearch cluster might be red
After all persistence nodes are updated to version 8.17.3, run the following command on all nodes (persistence + webapp), starting with the persistence nodes:
sudo bash /etc/veridiumid/update-procedure/current/resources/scripts/380/update_elk_stack.sh post
## check that the Elasticsearch cluster status is green
Run only on one webapp node:
## run on one node only:
/opt/veridiumid/migration/bin/elk_ops.sh --update-settings
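If you want to check the cluster health directly rather than through eops/check_services, Elasticsearch's health API can be queried (a sketch; assumes Elasticsearch listens on localhost:9200 and ELASTIC_USER/ELASTIC_PASSWORD are placeholders for your deployment's credentials; adjust scheme and port as needed):
curl -sk -u ELASTIC_USER:ELASTIC_PASSWORD "https://localhost:9200/_cluster/health?pretty"
## "status" : "green" indicates the cluster is fully healthy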
🌐 Useful Repositories
RHEL8: https://veridium-repo.veridium-dev.com/repository/VeridiumRPM8/packages/
RHEL9: https://veridium-repo.veridium-dev.com/repository/VeridiumRPM9/packages/
RHEL8 MD5 of each package:
| Package URL | MD5 | SHA1 | Description |
|---|---|---|---|
| | e7abb611d66e7953e107788496f20896 | 06c112cf2d6255f214d70d0f6857f16af2589cbf | VeridiumID Admin Dashboard |
| | 577613f879ae2a6a4560191510e05d97 | 2675e2b04001705224ae0caaf74fabe60b9ca7ae | VeridiumID migration tool |
| | 22aae104e971e0dcd5b93cf05bd0c15f | 0d74fb60800c609777a1c9c7b7c9b89b130a4062 | VeridiumID Websec |
| | 52ba6a9cd582a855e4e132d5c7b11967 | 380ab958747d53beed1ca8a92df52ac77d077927 | VeridiumID Directory Service component |
| | 3948f320f14e46d1c36d05b420cee06f | bd98acdb1b513915c31a245120b46ccea46a3c6e | VeridiumID DMZ service |
| | 79ca27f16c5dd3eac4bf8f4f33b40848 | 042e62e04978949cd479d029cdbda648a74fe0cc | VeridiumID Fido service |
| | b2e3987250707e59aceb4598e3aca863 | 41751e6eb5a7295444bc21b3bf9d3d8a95e84ec9 | VeridiumID Open Policy Agent |
| | c8f8b40b054013b3d396a8601685fe73 | b393354e49dbaece35e0dad8e5259a19f6a30de7 | VeridiumID Elasticsearch |
| | a390c09cbbb187ba49382307810e531b | 57925dfa1f156fe4692d8845d2fa987f2705c205 | VeridiumID Kibana |
| | 1a25f3fa1b45b58849671e60d55b1f81 | 2bcc329c123d8a32e4674e4b4bf36d23f0801771 | VeridiumID Zookeeper |
| | 6b8a973e55780192e9f41fdc0cf95863 | ff26ac3f9bba6f4a8bef8a09391cfe3e52be931e | VeridiumID Cassandra |
| | 20b5900e4e9c47e74913f741e44c8ab2 | 2e503481c6a0f965ca9533612f34ddf5ffa41a50 | VeridiumID Haproxy |
| | 25821fa56306b3ceec716b44537eca0a | 74cb91342a83113429352271d5d28f5566fe2b9f | VeridiumID Self Service Portal |
| | b8871fa0debc81926aa4d398e712fc59 | 806ab30b1c9c53d2836949948a0fff0c93ba5adc | VeridiumID Shibboleth Identity Provider |
| | 274dfc81aaad20c76d90b623ad84cf45 | c81e4039a2ab8dc38110544564df77c418695e70 | VeridiumID Tomcat |
| | 2b00345bbcab542315c37744e7f9258d | 0fbad784a4dd2d6cf033266a28a3d02cc6e22844 | VeridiumID Setupagent |
| | ddbb0c3d339ee72c962b283f47ea34b0 | 76a3f9d1b52a48da95d17f97be07bc10616105b2 | VeridiumID FreeRadius |
| | ed91989a6fe263c729478b53310c11b5 | 210e62ad9d00734e5cc0b4e11b2f0aca96eb994c | VeridiumID 4F biometric library |
| | 42582fa541d75d5cfae5dac39ab2b5fa | 214a332b5cd10e51809c0a7f40614a144afbd9ca | VeridiumID VFace biometric library |
| | 8712f93cdf43015004850908d00868c9 | 5d4375b718eb8d1f068881121cd54c1766a46a0f | Update scripts |
RHEL9 MD5 of each package:
| Package URL | MD5 | SHA1 | Description |
|---|---|---|---|
| | 218c0e534700335469b91082a5cbf0c0 | c640e44144c8fc0f37b8fcb567fe9366c6b5b17a | VeridiumID Admin Dashboard |
| | 166abad2eec5b78e6d030b8f13ececea | ff79f5c68d557c0b9c50d7350468ee89b34dfca8 | VeridiumID migration tool |
| | b59f45da3a58ae17a40c391844ac650d | e29093e59ea73fa5013eed304e98dc937b4d49a9 | VeridiumID Websec |
| | 44909d8133426969a8dbd632cd607882 | a47d6d828af46427869358120b8092bf1effdd37 | VeridiumID Directory Service component |
| | 5f15a6d398eb14f0ee32a3c5c9a05d5e | 59a5cdc14c9a55aa0c57e3379269648175d00165 | VeridiumID DMZ service |
| | eea4f288cd6ee6b971614f5a0da53baf | bd638dc81aae0ac22bb52e748074ca1c812284bd | VeridiumID Fido service |
| | 72c7150ce68f03525c675b7d32dcbbec | 255bf3fef877cc59e695a15a181a41961c304e4f | VeridiumID Open Policy Agent |
| | e709d8b22bf9cd3748517e69157a5bba | f2ba888e1bb686d2d36fb39903474eb5b291c838 | VeridiumID Elasticsearch |
| | 1085f71ab38a5e373210958986ca6d0a | 7f1cc465031370573a124e93fc6c358ed7c88182 | VeridiumID Kibana |
| | 21be1081e47a659d6c76267c8cc612bc | f22e40fc87f19d3efaa1a5c20ffa076c7397392f | VeridiumID Zookeeper |
| | 66c370c98895ff1e06c2142626ebf0f7 | 95d01d080e709752b04ab1a81251413f396188b6 | VeridiumID Cassandra |
| | 3921b7a6776e4e47de56e98d2c9f294f | 5576a7b6aca0de281a802988f8bb8d16428c39ce | VeridiumID Haproxy |
| | 9d183ac49cb5cc4162968f313d0f8034 | dd6f169a2ef79c63b2979cb8de1a6abacbde5533 | VeridiumID Self Service Portal |
| | af69d23b29ecf2baf1d70a0f0b9f34b0 | f0881697262efdc3586ab375011fecffe9466d18 | VeridiumID Shibboleth Identity Provider |
| | dbdd892bc663a40fbbed1eb83557faab | 724a5dda9acb113b71bb53eda039f667e010be46 | VeridiumID Tomcat |
| | a5fd6c1f6d849e9fff30630179e97f8d | 08bea02b0eb23b6da56eba27709da3ab99a8f327 | VeridiumID Setupagent |
| | 3fdaee11439c12e9620222a86bb60553 | 1eb78ecef844635cc6c296f5ad468b12782cc7d8 | VeridiumID FreeRadius |
| | e1789a1eead3d5afff516633417ab225 | cb73d1b39dc377f08cd90d4a0e6458fc909f144a | VeridiumID 4F biometric library |
| | 91420e9cab76f30886b913812d3f8b5b | a2739a4c09c8dc8283a7e8e722497cb255b96d15 | VeridiumID VFace biometric library |
| | f26c9bdf81d58df803acd80af92363cd | a9e0c8480224da44c95ba59fb5ee958f60caa014 | Update scripts |