by WebKeyDesign | Sep 28, 2021 | Linux, Networking, Software
Last month I walked into my home office and heard the buzzing of a UPS. After swapping it out with another, smaller UPS, I wiped off the dust and found the model number on the bottom: BE550G. These older UPS models are no longer supported by APC. After a search online, I found that BatteryPlus.com had a replacement battery and a store nearby. I ordered the Duracell Ultra 12V 9AH High Rate AGM SLA Battery with F2 Terminals [SLAHR12-9FR] and picked it up the same day. After letting the battery charge overnight, I had to hook the UPS up to my Windows machine to set the Battery Date using the PowerChute software; I could not find a way to do this from any other operating system or open source tool. Once that was done, I moved the UPS over to my pfSense firewall and connected it directly to one of the firewall's USB ports.
There are a couple of different UPS packages you can install on pfSense. Since pfSense is FreeBSD based, you could install the software natively, but the pfSense packages make the installation easier. Once configured, each package offers a widget that you can add to the pfSense dashboard. Here is what each one looks like.
Apcupsd
Developed only for APC UPS units, apcupsd features a better looking widget.
Network UPS Tools
Known as the NUT package, this open source software has a simpler dashboard widget; however, Network UPS Tools supports more devices and has extensive features for UPS units connected directly or over the network.
Additional Notes
Setting up either package requires reading the setup documentation online. I was able to run both packages with a directly connected USB device.
For apcupsd, set UPS Cable and UPS Type to “USB” and leave the Device field blank. If you are using NUT, set the UPS Type to Local USB and the driver to usbhid.
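For reference, this is roughly what those GUI settings correspond to if you were configuring the underlying software by hand; the file locations and the “apc” section name are assumptions and will vary by system, so treat this as a sketch rather than a drop-in config.
# apcupsd.conf (apcupsd)
UPSCABLE usb
UPSTYPE usb
DEVICE
# ups.conf (Network UPS Tools)
[apc]
driver = usbhid-ups
port = auto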
Overall, I am glad that I could salvage the UPS and keep it in service. This keeps perfectly good equipment working and prevents waste. The plus is that my firewall and internet connection will run a bit longer and will not reset during a power spike.
by WebKeyDesign | May 14, 2020 | Linux, Webmastering
Some of us prefer the Red Hat flavor of Linux to the Debian and Ubuntu distros. Although CentOS is a very stable system, you do run into issues, just as with any other operating system. Here is an issue that I encountered on my CentOS 8.1 web server while trying to fix something else.
Modular Dependency Problems with Perl
Running a dnf check would result in the following messages.
#dnf check
Modular dependency problems:
Problem 1: conflicting requests
- nothing provides module(perl:5.26) needed by module perl-DBD-MySQL:4.046:8010020191114030811:073fa5fe-0.x86_64
Problem 2: conflicting requests
- nothing provides module(perl:5.26) needed by module perl-DBI:1.641:8010020191113222731:16b3ab4d-0.x86_64
The fix was to run the following command:
#yum module enable perl:5.26
After this, I reran the dnf check and the dependency problems were gone.
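If you want to confirm the change, dnf can also list the available perl module streams; the enabled stream is flagged with [e] in the output:
dnf module list perl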
by WebKeyDesign | Jun 21, 2017 | Linux, Webmastering
My preferred Linux distro at the moment is CentOS 7. It is a community-supported distribution that follows the work Red Hat does with Red Hat Enterprise Linux (RHEL). It is a stable operating system for web servers and something I use every day. Unlike MacOS or Windows, which have lots of internet resources for support, Linux is a bit different. It inherits most of the UNIX terminology and documentation, and perhaps because of this, you will find it harder to research things. After a while, I started to make notes for myself, and this post is the end result of some of that note taking. Many things in Linux are step oriented; for example, you should not install PHP before you have Apache or some other web server installed. Below I have documented some of the steps I take after a bare minimum install of CentOS 7 without any GUI. I plan on revising this post in the future as I add to or revise my post installation steps.
Note: It is assumed that you are aware that all administrative commands in Linux require root privileges, so I have left out the sudo part. Learn more about how to become root on the CentOS Wiki.
Post Installation Tasks:
1. Update System (Update YUM and Install Updates)
This will automatically update the system. The -y option will suppress any prompting to accept the changes.
yum -y update && yum -y upgrade
2. Enable Repositories
Before installing some packages in the next section, you will need to enable some repositories. The most common are EPEL, IUS, and Remi. Unlike Ubuntu, CentOS is a Linux distro that caters to users interested in an enterprise platform. This means that CentOS chooses stability over newer software. The EPEL, IUS, and Remi repositories aim to bring newer versions of software to CentOS without compromising the overall goal of stability. I leave it up to you to read about which repositories to enable and why. In this tutorial, we will enable EPEL and Remi.
Enable EPEL Repo:
To enable EPEL, just use the YUM command. If this command does not work, reference the EPEL Wiki for more information.
yum install epel-release
Enable Remi Repo:
The primary reason for enabling Remi is for testing out newer versions of PHP. You can reference the Remi site for more information; however, if you are primarily interested in PHP, it is better to use the Remi Configuration Wizard to learn about the various ways you can set up PHP. Note that PHP is also available through the IUS repository. For this tutorial we will install only PHP version 7.1 from Remi.
In order to install PHP from Remi, we must enable EPEL. We have already done this so we will skip the first step.
# EPEL already enabled, so this step is skipped:
# yum install https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
yum install http://rpms.remirepo.net/enterprise/remi-release-7.rpm
yum install yum-utils
yum-config-manager --enable remi-php71
yum update
To install additional PHP extensions, replace xxx with the extension name:
yum install php-xxx
We can then verify the PHP version and the PHP extensions installed:
php --version
php --modules
3. Programs and Applications to Install
The following commands, applications, and packages are useful to install on a new system. You can use yum to install each of them. Skip down to the YUM Commands section to learn more about yum commands. In Linux, almost anything installed by yum is called a package; however, most Windows users are more familiar with the terms application or program. I use the term application interchangeably with package.
To install a specific application/package…
yum install package-name
YUM Utils
These utilities are required in order to remove old kernels and to use YUM plugins.
yum install -y yum-utils
nano
Nano is an easy-to-use text editor for those of us who are not Vim or Emacs masters. I highly recommend using Nano if you are new to Linux.
net-tools
If you did a minimal install of CentOS 7, the ifconfig command will be missing. Installing net-tools provides it. This is the rough equivalent of the ipconfig command in Windows.
nmap
Nmap is a utility for network exploration or security auditing. Once installed, use it to list all open ports and services.
nmap 127.0.0.1
rkhunter
Rootkit Hunter is an easy-to-use tool which checks computers running UNIX (clones) for the presence of rootkits and other unwanted tools.
rkhunter --check
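Before running the first check, it is also worth updating rkhunter's data files and creating a baseline of file properties; both are standard rkhunter options, though some builds ship with the updater disabled:
rkhunter --update
rkhunter --propupd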
telnet
Telnet is a popular protocol for logging into remote systems over the Internet. The package provides a command line Telnet client.
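Beyond remote logins, the telnet client is handy for quickly checking whether a TCP port is answering. For example, to poke at a web server running on the local machine (port 80 here is just an example):
telnet 127.0.0.1 80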
tree
The tree utility recursively displays the contents of directories in a tree-like format. Tree is basically a UNIX port of the DOS tree utility.
wget
GNU Wget is a file retrieval utility which can use either the HTTP or FTP protocols.
unzip
The unzip utility is used to list, test, or extract files from a zip archive.
zip
The zip program is a compression and file packaging utility.
4. CentOS VirtualBox Guest
If you are setting up CentOS as a guest operating system on VirtualBox, there are some additional steps outlined on the CentOS Wiki.
Directory Structure:
With the Tree command installed, you can get a good view of the system.
tree -C -L 1 /
- /bin – command binaries (this is actually a symbolic link to usr/bin)
- /boot – boot system
- /dev – system device files
- /etc – system configuration files
- /home – user home directories
- /lib – libraries for system binaries (this is actually a symbolic link to usr/lib)
- /lib64 – 64-bit libraries (this is actually a symbolic link to usr/lib64)
- /media – mount point for removable media
- /opt – third-party software packages
- /proc – system and process information
- /root – root user home folder
- /run – information about running processes
- /sbin – system binaries (this is actually a symbolic link to usr/sbin)
- /srv – files for various services
- /sys – similar to /proc
- /tmp – temporary files
- /usr – another place for applications
- /var – variable files, such as logs
For a more detailed explanation of the directory structure see LinuxInsider’s The Filesystem Hierarchy Standard.
YUM Commands:
Search for an application and description
yum search zip
Display information for an application
yum info zip
Install an application
yum install zip
Check if an application is installed
yum list zip
See all applications installed
yum list installed | less
Remove or Uninstall an Application
yum remove zip
Check What Needs Updating
yum check-update
yum list updates
yum list available
Update All
yum update
Yum Commands for Groups
yum grouplist
yum groupinfo "group name"
yum groupinstall "group name"
yum groupremove "group name"
yum groupupdate "group name"
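As a concrete example, "Development Tools" is a group available on CentOS 7 that pulls in gcc, make, and related build packages:
yum groupinfo "Development Tools"
yum groupinstall "Development Tools"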
Repositories are setup under:
/etc/yum.repos.d/
List Enabled Repositories
yum repolist
Display All Repositories (Both Enabled and Disabled)
yum repolist all
YUM Clean Commands
yum clean expire-cache
yum clean packages
yum clean headers
yum clean metadata
yum clean dbcache
yum clean rpmdb
yum clean plugins
yum clean all
Yum Plugins
yum info yum
Yum Delta RPMS
To save bandwidth, yum has a feature that allows you to download only the changed parts of a package instead of the entire package. Not all repositories utilize this, and the minimal install of CentOS does not enable delta RPMs. Enabling this feature is a trade-off between CPU utilization and bandwidth. Learn more about delta RPMs.
yum install -y deltarpm
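Once the deltarpm package is installed, yum should use delta RPMs automatically when a repository provides them. If you ever want to turn the feature off and spend bandwidth instead of CPU, my understanding is that the deltarpm option in /etc/yum.conf controls this:
# in /etc/yum.conf
deltarpm=0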
Disk Space:
Use the Tree command to list out directories. To see disk space, use the df and du commands.
df -h
du -hd1 / | sort -h
Find Large Files
Use the find command to list the 10 largest files.
find -type f -exec du -Sh {} + | sort -rh | head -n 10
find / -type f -exec du -Sh {} + | sort -rh | head -n 10
find /home/username -type f -exec du -Sh {} + | sort -rh | head -n 10
find /var/log -type f -exec du -Sh {} + | sort -rh | head -n 10
Find files larger than 100 Megabytes (ignore the false positives for /proc)
find / -size +100M
Find files changed in the last day
find -ctime -1 -ls
Run the YUM Clean command
yum clean all
Remove Old Kernels
This will remove all old kernels except the most recent one. Note that if you recently installed a new kernel, you need to reboot the system first and then run this command. You can also set the count higher than 1 to keep more than one kernel.
package-cleanup --oldkernels --count=1
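To double-check which kernel packages are installed and which kernel is currently running (both commands are read-only and safe to run at any point):
rpm -q kernel
uname -r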
Delete Old and Rotated Log Files
List the log files first and then delete them.
find /var/log -type f -regex ".*\.gz$"
find /var/log -type f -regex ".*\.gz$" -delete
find /var/log -type f -regex ".*\.[0-9]$"
find /var/log -type f -regex ".*\.[0-9]$" -delete
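If you would rather not delete logs by hand, logrotate (usually present even on a minimal CentOS 7 install) can be forced to rotate everything according to its existing configuration:
logrotate -f /etc/logrotate.conf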
by WebKeyDesign | Jun 6, 2017 | Apache, Linux, Webmastering
It has been over ten years since I started this website. WebKeyDesign was the outcome of my rebellious nature and my love of technology. In 2005, a few of my coworkers liked having discussions to pass the day, so we set up an online forum on one of our Windows 2000 computers. This only lasted a few months. As you can imagine, our network admin did not share our fondness for discussing the merits of Star Wars versus The Matrix Trilogy. The online forum was shut down, so I took it upon myself to set up a simple Perl forum script with my ISP. The forum worked well enough and was accessible over the Internet, but my ISP home space was limited and we quickly outgrew the space allotted. The obvious solution was to open my wallet and purchase a domain and hosting space. In a matter of months our little forum grew into a proper forum solution using Invision PowerBoard and a cPanel Linux host. Interest in the forum ended up waning after a couple of years, and in the end all that was left was my curiosity with all things web server related. WebKeyDesign was the next logical step. The idea was to help people set up some simple websites and to keep learning more about web hosting and web servers.
Today, you can find multiple solutions to web hosting and application services. There’s AWS, Google, Microsoft, and other cloud providers who all can provide robust internet services. There are also multiple shared web hosting solutions and virtual machine providers like DigitalOcean. The choices available today are significantly more varied than they were back in 2005. However this post is about where to start and some lessons learned.
The Poor Man’s Sandbox:
Much like in business, your budget is most likely fixed. You can only spend so much. Although you can set up a web server on your iMac at home, or even on a Raspberry Pi nowadays, you don't really want to do this. There are multiple reasons to advise against a home web server setup. Primarily, you want a web server to be stable and reliable. Your home internet connection can go down due to a power failure, ISP issues, or even your dog pulling out your ethernet cable. There are just too many points of failure with a home web server. You will want a hosted solution.
Any hosted solution is going to cost you. You can save some percentage of funds by paying up front for a year's worth of hosting, but when you are starting out, it is best to just pay monthly. You want to have the option to try multiple hosting companies. Some people find that a typical shared hosting option is what they are comfortable with, while others will want cloud services like AWS. Give yourself the time and options to try different hosting solutions.
Server Operating Systems:
There is an equally wide range of operating systems your web server can run on top of. Given the title of this post, you probably already know what I am going to recommend, but let's summarize some of the other options.
Microsoft Windows Server is an operating system that most technical people may already be familiar with. Microsoft has a long history in the server space, and today's Windows Server 2012 and 2016 are available as virtual instances from Microsoft and other cloud providers. However, Microsoft systems are not free and carry licensing costs. You will find Microsoft to be more expensive than your typical Linux-based solution. The other reason not to start with a Microsoft solution is that Windows Server is usually associated with Microsoft's own IIS web server software. The Microsoft stack is a closed commercial solution, with some exceptions.
Web servers tend to be UNIX based. There are multiple versions of UNIX or UNIX-like operating systems: there is BSD, Oracle Solaris, and then there is of course Linux. You can find hosting solutions based on BSD, but most likely you are going to find lots and lots of Linux-based solutions. Even though everyone calls it Linux, the Linux system is really the Linux kernel and GNU put together. There are many distributions of the Linux system, but most of them fall into three categories: the business-oriented Red Hat family, the Debian-based distros, and what I call the bare-bones distros like Arch Linux. If you are not comfortable with UNIX, most often you want something like Red Hat or the Debian-based Ubuntu distribution. For most of us, it comes down to choosing between Red Hat and Ubuntu.
Red Hat is in the business of supporting business customers, so the Red Hat OS is not technically free: you have to pay for support. Red Hat creates a free, open-source distro named Fedora, from which it builds the Red Hat OS. Fedora changes frequently and is kind of like a developer's OS. From the Red Hat OS sources we also get a community-driven, stable distro known as CentOS. Most Linux-based hosts will run CentOS because it is stable and free. Ubuntu, on the other hand, is a very popular Linux distro that is based off of Debian. Many developers like Ubuntu because it is updated more frequently and is also free. While CentOS concentrates on stability, Ubuntu favors more frequent updates. Either OS is a good choice for web services, but my personal leaning is toward CentOS. If you will be developing software in the future or as your day job, CentOS is going to be closer to what businesses use for their systems. If you plan on just developing software for your own personal use or just want to learn some things, then Ubuntu is a good choice.
Control Panels:
Even though server operating systems like CentOS and Ubuntu are free, one additional cost that does get passed on to you is the control panel. In the world of Linux there is cPanel, and then there is everyone else. cPanel's costs are significant, and it pretty much controls the control panel market. Large hosting providers get cPanel licenses cheaper than everyone else, so they can offer cPanel more cheaply as part of their solution. cPanel can manage anything from a shared web hosting account, to a reseller account, to a virtual machine, to an entire fleet of physical web servers. It makes supporting and managing web servers easier for hosting providers. Other hosting providers use different control panels or roll their own, much like Amazon AWS.
There are virtual machine providers like Linode and DigitalOcean who provide basic controls for the virtual machine and nothing else. You are free to manage your virtual machine through SSH (Secure Shell), or purchase an individual license for cPanel or any other control panel on your own. One option is to install the free GPL version of Virtualmin; while not as user friendly as cPanel, it provides most of cPanel's functionality. Note, however, that the farther you move away from a typical shared hosting solution, the more you are on your own when it comes to technical issues and support.
Apache or Nginx:
With a shared hosting solution or reseller account, the web server will most likely be Apache or Nginx. With a virtual machine or cloud solution, you may be able to install the web server yourself. You can't really go wrong with learning either of these popular HTTP servers. Most of the knowledge that you gain from Apache can apply to other web servers, so if you have to pick one to start with, I'd recommend Apache.
Webmaster:
Once you have your sandbox up, it is time to set up a WordPress blog, a simple website, or perhaps write your first web app.
The Linux web server is an incredible piece of software engineering. It has allowed me to pursue my interests in computing and communicate with people all over the world. It has quite literally changed my life, and I have no doubt that it can change yours.
by WebKeyDesign | Nov 9, 2016 | Apache, Linux, Software, Webmastering
Security is now a central concern for technical people and, I would argue, for most consumers. It is now typical for criminals to target banks, hospitals, and other critical institutions. Privacy is also an issue that is central to a free and progressive society. One solution that gets thrown around is SSL encryption for websites and how we all now need to secure our sites with an SSL certificate. Due to the market, though, SSL certificates are one of those things that companies have a hard time making money off of. Most people do not buy SSL certificates, so you wind up with a market that sells bare-bones SSL certificates for around $25 and extended validation certificates for large ecommerce websites that cost thousands of dollars. This is where Let's Encrypt changes things. Their certificates are free and are recognized by the web browser as valid secure certificates. This makes SSL encryption a zero-cost option for millions of individual webmasters who run websites like WebKeyDesign. There is one other difference with Let's Encrypt certificates: they are valid for 3 months instead of a year. However, what makes Let's Encrypt more appealing to webmasters is that the software makes renewals automatic, and there is now software integration with the cPanel and Virtualmin control panels.
My personal project is a virtual machine that I keep for journal purposes. It allows me to write down some thoughts and archive information for later viewing. The virtual machine runs CentOS 7 Linux and can be controlled using Virtualmin. The SSL certificate that was originally set up was self-signed, so I had to manually add the certificate to iOS and MacOS and make exceptions in browsers in order to use the website.
Update:
Since writing this, a few things have changed. Let's Encrypt now requires version 2 of their protocol, and old clients are no longer supported. Virtualmin needs to be updated to support the new client. You can read more about the issue on this Virtualmin Forum post. To get this working on CentOS 7, do the following first:
yum install certbot
certbot register
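Once registered, you can confirm what certbot is managing at any time; the certificates subcommand lists each certificate along with its domains and expiration date:
certbot certificates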
I followed TechJourney’s excellent guide: How to Use Let’s Encrypt SSL Certificate Automatically in Virtualmin & Webmin. There were a couple of issues I found out along the way.
Webmin Configuration
The tutorial did not specify the path to the client command. For CentOS, I found this to be:
/root/letsencrypt/letsencrypt-auto
This may not be needed. I was able to let Virtualmin automatically find the new client.
Let’s Encrypt SSL for Webmin Login
A secondary problem that I ran into had to do with the separate subdomains. The Apache web server will respond on your typical www.mydomain.net and mydomain.net; however, the Webmin control panel is accessible at another prefix of mydomain.net. Under Virtualmin – Server Configuration – Manage SSL Certificate, the default will be Domains associated with this server. This setting will only pull in the domains that Apache is set up for. If you want to use the Let's Encrypt SSL certificate for other subdomains, you have to select Domain names listed here and manually type all your subdomains. You can then use the Copy to options under the Current Certificate tab to apply the same certificate to Webmin, Usermin, etc.
If you went ahead and hit the Request Certificate button and then try to add domains, the process will error out. There is no way to reset the certificates from the Virtualmin interface. To resolve the problem, use secure shell and remove the letsencrypt directory.
rm -rf /etc/letsencrypt
This allowed me to use the Request Certificate option again and have all my subdomains added to the certificate.
by WebKeyDesign | May 9, 2014 | Linux, Networking, Software
SARG Reports are a good complement to Squid Proxy, and since there is a package available for installation in pfSense, it makes good sense to set up SARG Reports. The downside to SARG Reports is that the reports do take up space, and over time this can be significant. This posting is about a problem I encountered on pfSense 2.1 and the latest SARG package.
For some unknown reason the reports stopped generating. Upon checking my System Log, this is the issue I found:
php: /pkg_edit.php: The command 'export LC_ALL=C && /usr/pbi/sarg-amd64/bin/sarg -d `date +%d/%m/%Y`-`date +%d/%m/%Y`' returned exit code '1',
the output was 'SARG: Cannot get the modification time of input log file /var/log/squid/access.log (No such file or directory). Processing it anyway SARG: File not found: /var/log/squid/access.log'
I am using the 64-bit version of pfSense, hence the sarg-amd64 path. If you are using the 32-bit version, it will instead say sarg-i386.
The solution is to edit the sarg.conf file, which is located in one of these locations depending on your pfSense build:
/usr/pbi/sarg-amd64/etc/sarg/sarg.conf
/usr/pbi/sarg-i386/etc/sarg/sarg.conf
You will need to verify that the access_log line is correct:
#access_log /usr/local/squid/var/logs/access.log
In my case, removing the # sign and specifying the correct path to my Squid access.log corrected the problem.
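For reference, after the edit my access_log line pointed at the path shown in the error message above; your Squid log location may differ:
access_log /var/log/squid/access.log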
If you have issues with SARG Reports, it is best to do the following:
- Under the Status menu, click SARG Reports.
- On the General tab, click Save.
- Next, click on the Users tab and click Save.
- Click Schedule and create your schedule, or if you already have one, open it up and click Save.
- You can then go back to the Schedule and use Force Update to see if SARG Reports are working now.
I also schedule SARG Reports in cron to run at 11:50 pm every night instead of midnight:
50 23 */1 * *
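Combined with the sarg invocation from the log entry earlier in this post, the full crontab line would look roughly like this (the fields are minute, hour, day of month, month, and day of week):
50 23 */1 * * export LC_ALL=C && /usr/pbi/sarg-amd64/bin/sarg -d `date +%d/%m/%Y`-`date +%d/%m/%Y`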