_________ .__ _____ __ .__
/ _____/_ _ _|__|/ ____\/ |______ ____ __ __| | _____ _______
\_____ \\ \/ \/ / \ __\\ __\__ \ _/ ___\| | \ | \__ \\_ __ \
/ \\ /| || | | | / __ \\ \___| | / |__/ __ \| | \/
/_______ / \/\_/ |__||__| |__| (____ /\___ >____/|____(____ /__|
\/ \/ \/ \/
This repository creates a virtualized OpenStack Swift cluster using Vagrant, libvirt, and Ansible.
Note that this will start seven virtual machines on your computer.
# Clone the swiftacular repo
$ cd swiftacular
# Install prerequisites on the host
$ ./install_prereqs.sh
# Deploy Swift and monitoring dashboards
$ ./bootstrap_swift_with_monitoring.sh
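Once the bootstrap script finishes, a quick way to confirm that all seven VMs are running (the machine names come from the Vagrantfile) is:
$ vagrant status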
Guest OS | Swift Version | Status |
---|---|---|
CentOS Stream 9 | OpenStack stable/2025.1 | Supported |
Ubuntu 24.04 | OpenStack stable/2025.1 | WIP |
By default, Vagrant will use CentOS Stream 9:
$ vagrant up
To use Ubuntu 24.04 instead:
$ VM_BOX=ubuntu vagrant up
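You can check which base boxes have already been downloaded with:
$ vagrant box list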
- Runs OpenStack Swift in VMs on your local computer, but with multiple servers
- Uses a separate replication network, which means this could be the basis for a geo-replication system
- SSL - Keystone is configured to use SSL, and the Swift Proxy is fronted by an SSL terminator (see the certificate check after this list)
- Sparse files back the Swift disks
- Tests for uploading files into Swift
- Use of gauntlt attacks to verify the installation
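As a quick check of the SSL termination, you can inspect the self-signed certificate with openssl. The address is an assumption; substitute your lbssl server's IP on the public network:
$ echo | openssl s_client -connect <lbssl-ip>:443 2>/dev/null | openssl x509 -noout -subject -issuer -dates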
Minimal host requirements: 6 vCPUs, 16 GB RAM, ~120 GB disk
Recommended host requirements: 16+ vCPUs, 64+ GB RAM, 500+ GB SSD (preferably NVMe)
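To check that your host meets these minimums before bringing the cluster up:
$ nproc     # CPU count
$ free -h   # available RAM
$ df -h .   # free disk space on the current filesystem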
Seven Vagrant-based virtual machines are used for this playbook:
- package_cache - One apt-cacher-ng server, so packages only have to be downloaded from the Internet once
- authentication - One Keystone server for authentication
- lbssl - One SSL termination server that will be used to proxy connections to the Swift Proxy server
- swift-proxy - One Swift proxy server
- swift-storage - Three Swift storage nodes
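Since the provider is libvirt, the same machines are also visible to virsh; note that Vagrant usually prefixes the libvirt domain names with the project directory name:
$ virsh list --all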
Each VM has four networks (technically five, including the Vagrant management network). In a real production system every server would not be attached to every network; in fact, you would want to avoid that. Here, for simplicity, they are all attached to every network.
- eth0 - Used by Vagrant
- eth1 - 192.168.100.0/24 - The "public" network that users would connect to
- eth2 - 10.0.10.0/24 - This is the network between the SSL terminator and the Swift Proxy
- eth3 - 10.0.20.0/24 - The local Swift internal network
- eth4 - 10.0.30.0/24 - The replication network which is a feature of OpenStack Swift starting with the Havana release
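You can inspect these interfaces from inside any of the VMs; the machine name here follows the same naming pattern as the transcript below and is an assumption:
$ vagrant ssh swift-proxy-01 -c 'ip -br addr'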
Because this playbook configures self-signed SSL certificates, by default the swift client will complain about the certificate. Either pass the --insecure option or set the SWIFTCLIENT_INSECURE environment variable to true.
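For example, either of the following works:
$ swift --insecure list
$ export SWIFTCLIENT_INSECURE=true
$ swift list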
You can install the swift client anywhere that has access to the SSL termination point and Keystone, so you could also put it on your local laptop, probably with:
$ pip install python-swiftclient
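If you would rather keep it out of your system Python, a virtualenv works as well:
$ python3 -m venv ~/.venvs/swift
$ . ~/.venvs/swift/bin/activate
$ pip install python-swiftclient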
However, I usually log in to the package_cache server and use swift from there.
$ vagrant ssh swift-package-cache-01
vagrant@swift-package-cache-01:~$ . /vagrant/testrc
vagrant@swift-package-cache-01:~$ swift list
vagrant@swift-package-cache-01:~$ echo "swift is cool" > swift.txt
vagrant@swift-package-cache-01:~$ swift upload swifty swift.txt
swift.txt
vagrant@swift-package-cache-01:~$ swift list
swifty
vagrant@swift-package-cache-01:~$ swift list swifty
swift.txt
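To round-trip the test from the same session, download the object back and check the container's stats:
vagrant@swift-package-cache-01:~$ swift download swifty swift.txt
vagrant@swift-package-cache-01:~$ swift stat swifty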
If you want to redo the installation, there are a few options.
To restart completely:
$ ./cleanup.sh
$ ./bootstrap_swift_with_monitoring.sh
There is a script to destroy and rebuild everything but the package cache:
$ ./bin/redo
$ ansible -m ping all # just to check if networking is up
$ ansible-playbook deploy_swift_cluster.yml
To remove and redo only the rings and fake/sparse disks without destroying any virtual machines:
$ ansible-playbook playbooks/remove_rings.yml
$ ansible-playbook deploy_swift_cluster.yml
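After the redeploy, you can confirm the rings were rebuilt by inspecting a builder file; the machine name and the /etc/swift path are assumptions based on the usual Swift layout:
$ vagrant ssh swift-proxy-01 -c 'swift-ring-builder /etc/swift/object.builder'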
To remove the Keystone database and redo the endpoints, users, regions, etc.:
$ ansible-playbook playbooks/remove_keystone.yml
$ ansible-playbook deploy_swift_cluster.yml
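A quick sanity check that the recreated endpoints work is to authenticate with the swift client again, sourcing the same credentials file as above:
$ . /vagrant/testrc
$ swift stat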
To change Vagrant's debug log level:
$ export VAGRANT_LOG=debug
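Or set it for a single run only, capturing the debug stream (which goes to stderr) to a file:
$ VAGRANT_LOG=debug vagrant up 2> vagrant-debug.log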
A custom Ansible module for managing Keystone users is bundled with the playbook:
- library/swift-ansible-modules/keystone_user
See the issues in the tracking system on GitHub for Swiftacular.