To learn more about Internet services, I built my own home web and mail server.


I used an Acer Aspire E360 that was going to be thrown away to host a Plone web server, a Dovecot mail server, and a Postfix mail transfer agent.


I installed CentOS 6 and set up Apache as a reverse proxy that redirects traffic to the Plone content management system. I then set up Postfix, Postgrey, and Dovecot for mail services. I added Fail2Ban and hardened SSH to prevent unauthorized access.
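The proxy setup can be sketched as an Apache virtual host; the domain and Plone site name below are placeholders, and the rewrite URL follows the usual Zope VirtualHostMonster convention (mod_proxy and mod_rewrite must be enabled):

```apache
<VirtualHost *:80>
    ServerName example.duckdns.org

    # Proxy all traffic to the Plone instance listening on localhost:8080;
    # the VirtualHostMonster path makes Plone generate correct external URLs.
    RewriteEngine On
    RewriteRule ^/(.*) http://127.0.0.1:8080/VirtualHostBase/http/example.duckdns.org:80/Plone/VirtualHostRoot/$1 [P,L]
</VirtualHost>
```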

Terminal window with IPtables output

For further security, I also disabled PAM authentication unless logging in from TTY1. I set up DuckDNS for dynamic DNS and, as a result, had to use self-signed certificates for HTTPS.
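Generating a self-signed certificate is a one-liner with OpenSSL; the hostname below is a placeholder for the actual DuckDNS domain:

```shell
# Create a self-signed certificate and key valid for one year
# (hostname is a placeholder; adjust the file paths to match your
# Apache and Dovecot SSL configuration).
openssl req -x509 -nodes -newkey rsa:2048 -days 365 \
    -subj "/CN=example.duckdns.org" \
    -keyout server.key -out server.crt
```

Browsers will still warn about the certificate being untrusted, but the connection itself is encrypted.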

IP Address Reservation in router

Once the server was set up, my router had to be configured to allow connections from the WAN. First, I reserved IP addresses in the LAN by MAC address to ensure each IP address is always assigned to the correct computer.

Port Forwarding of services in router

With a fixed IP address to forward traffic to, I then forwarded the server's common ports to that address.


External port 8080 was forwarded to internal port 80 for HTTP to work around my ISP blocking port 80 by default.


I succeeded in setting up an old PC as a home server.

Terminal window with htop output


Currently, the computer is running stock hardware, and my short-term goal is either to upgrade the current system (it needs more RAM) or to build a custom one. If anyone has spare used hardware, please Contact Me.

March 2017 Update

My home server was shut down because of blocked ports and a data cap. It may be used for another project later on.

I now use a shared hosting server as a web/mail/file server. Since I can't install the Plone CMS on the shared host, I use a JavaScript-based wiki called TiddlyWiki.

In the future, I plan to migrate the whole site to AWS and rewrite it in Python.

December 2018 Update

Migration to AWS is complete: SES and WorkMail handle e-mails. Route 53 takes care of DNS services for my domains. S3 holds all the files and works with Route 53 to direct traffic.

Finally, ACM provisions and manages the SSL/TLS certificates.

Time will tell how much I’ll save with the migration, but I look forward to it!

Next, I’ll be rewriting it in Python so that the HTML files are smaller.

Second Update

The website has been rewritten in Python using the Pelican package and the AWS CLI. It took a solid two days, but writing content in reST and uploading everything to S3 is a breeze now!
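For a sense of the workflow, a Pelican article in reST is just a plain-text file with a title and a metadata block (the title, date, and tags here are illustrative):

```rst
My Article Title
################

:date: 2018-12-20
:tags: aws, pelican
:slug: my-article-title

Article body goes here, written in plain reStructuredText.
```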

Not only that, but the HTML files are tiny compared to before! From 2 MB down to 50 KB?! Snappy!

Website analytics will also improve: individual page views are now tracked, so I can see which pages need more content or refactoring.

I’m also using Bitbucket to keep track of all my source files so that I can work on my website from anywhere.

My old website is still available at TiddlyWiki.JoseALerma.com, but don’t expect to see many updates there anymore.

There are still a few things I want to do with this site: it needs a sitemap file, and since it's written in Python, custom scripts can help automate writing and publishing articles.
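As a first step toward that automation, a small shell helper can rebuild the site and push it to S3 in one command; the bucket name, settings file, and output directory are assumptions based on Pelican's defaults:

```shell
#!/bin/sh
# Hypothetical publish helper: rebuild the Pelican site, then sync the
# generated output/ directory to the given S3 bucket, deleting stale files.
publish_site() {
    pelican content -s publishconf.py &&
    aws s3 sync output/ "s3://$1" --delete
}
```

Usage would be `publish_site my-website-bucket` after writing a new article.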

Until then, Contact Me if you find any typos, broken links, or can think of improvements!

January 2019 Update

Since the files are smaller, I'm now using AWS CloudFront to distribute the website around the world. As a result, page loads should be nearly instant as long as AWS has an edge location in your region. However, because of how CloudFront interfaces with S3 origins, I also had to set up a Lambda@Edge JavaScript function that appends index.html to URIs without a specified index page.

If your browser still reports the site as being ‘Not Secure’, that’s likely because I recently updated Pelican to use the https protocol and it may take a while for CloudFront to get the latest files.

Site now mapped! I used a handy Pelican plugin to generate the sitemap. Truly, we are not worthy of Pelican’s awesome-possumness!
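For reference, enabling the sitemap plugin is just a few lines in the Pelican settings file; the plugin path and the priority/changefreq values shown are illustrative:

```python
# pelicanconf.py -- hypothetical settings for the Pelican sitemap plugin
PLUGIN_PATHS = ['plugins']
PLUGINS = ['sitemap']
SITEMAP = {
    'format': 'xml',
    'priorities': {'articles': 0.5, 'indexes': 0.5, 'pages': 0.5},
    'changefreqs': {'articles': 'monthly', 'indexes': 'daily', 'pages': 'monthly'},
}
```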

With the new sitemap, Google search should do a better job at indexing the website and providing relevant links for keyword searches.