Reflecting On My Progress With Virtualization

Virtualization was a very confusing thing when I first encountered it.  After spending two years using it for almost every project and investigation I have done, it’s fun to look back on how far I’ve come in putting virtualization to work across everything I do in cybersecurity.

As I was creating a Splunk lab this week, I began to think back to about a year ago, when I tried to follow a tutorial for doing the same thing.  I remember just carrying out the commands as they were presented without really understanding what was going on.  If I couldn’t follow a particular command, I was pretty much stuck.  I simply did not have the techniques I have now for managing a large project like a homelab, one with multiple elements such as a Splunk indexing server, a web server, and other components that simulate a realistic network and give you good hands-on experience with Splunk.  While I was setting up the various components of my Splunk lab, I realized I was carrying out from my own understanding what I had once struggled to follow in a tutorial.
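
To make that a bit more concrete, the core of the lab is simply getting the other machines to send their data to the indexer.  Below is a rough Python sketch of how I might generate an outputs.conf for a Splunk Universal Forwarder on one of those machines; the indexer address is a made-up placeholder, and 9997 is just the conventional Splunk receiving port.

```python
# Rough sketch: render an outputs.conf that points a Splunk Universal
# Forwarder (for example, on the web server VM) at the lab's indexer.
# The IP address below is a placeholder for whatever your own lab uses.

INDEXER_IP = "192.168.56.10"   # hypothetical address of the Splunk indexer VM
RECEIVING_PORT = 9997          # conventional Splunk receiving port

OUTPUTS_CONF_TEMPLATE = """\
[tcpout]
defaultGroup = lab_indexers

[tcpout:lab_indexers]
server = {ip}:{port}
"""

def render_outputs_conf(ip: str = INDEXER_IP, port: int = RECEIVING_PORT) -> str:
    """Return outputs.conf text for a forwarder in the lab."""
    return OUTPUTS_CONF_TEMPLATE.format(ip=ip, port=port)

if __name__ == "__main__":
    # Drop the output into $SPLUNK_HOME/etc/system/local/outputs.conf
    # on each machine that should forward to the indexer.
    print(render_outputs_conf())
```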

I’m bringing this up to emphasize how important it is to get this hands-on experience and to spend a lot of time spinning up virtual machines for various purposes.  What it has given me is the ability to take whatever systems I need to incorporate into a lab environment and connect them together for more complex projects and advanced subjects.  It’s one thing to be able to spin up a single Linux machine, but it’s exhilarating to be able to network several Linux machines together to simulate a small office environment.
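
As an illustration of what that networking looks like in practice, here is a rough sketch of how several VMs could be attached to the same isolated network from a script.  It assumes VirtualBox and its VBoxManage tool, and the VM names and the "labnet" network name are invented for the example.

```python
# Sketch: attach a handful of lab VMs to the same VirtualBox internal
# network so they can talk to each other while staying off the host's LAN.
# Assumes VirtualBox is installed and the VMs already exist and are powered off.
import subprocess

LAB_NETWORK = "labnet"   # hypothetical internal network name
LAB_VMS = ["ubuntu-client-1", "ubuntu-client-2", "web-server", "splunk-indexer"]

def attach_to_lab_network(vm_name: str, network: str = LAB_NETWORK) -> None:
    """Point the VM's first NIC at the shared internal network."""
    subprocess.run(
        ["VBoxManage", "modifyvm", vm_name,
         "--nic1", "intnet",        # NIC 1 uses internal networking
         "--intnet1", network],     # ...on this named internal network
        check=True,
    )

if __name__ == "__main__":
    for vm in LAB_VMS:
        attach_to_lab_network(vm)
        print(f"{vm} attached to internal network '{LAB_NETWORK}'")
```

The specifics will differ on other hypervisors, but the idea is the same: every machine lands on one shared, isolated network, and the "small office" exists entirely inside the host.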

I am glad that I broke out of the trap of thinking that the kind of experience I need comes from physical hardware.  When I first started, I was planning on going all out and buying thin clients, server machines, and rack-mounted hardware to simulate a network environment.  Then I watched a video by David Bombal in which he remarked that this is unnecessary and an outdated approach to building a lab environment; all you really need to do is virtualize everything, and you get the same experience.  From that point on, I put aside the desire to purchase all of that physical equipment and went with virtualization instead.

When it comes to making the complex homelabs I’m working on now, there are a few techniques I could not manage these large-scale projects without.  First, organize your homelab in some kind of central notebook, with an easy way to look up the purpose of each machine, along with usernames and passwords, subnet information, IP addresses, and so forth.  Second, come up with standard operating procedures for installing your virtual machines, because you will find yourself repeating the same steps time and time again.  Virtualization also naturally leads to topics like snapshotting and outright cloning entire virtual machines for quick deployment, which I don’t make use of as much as I should, save for snapshots, which are crucial for testing and rolling back to a previous point in time.
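
To make the first and last points a little more concrete, here is a minimal sketch of what that central notebook and a snapshot step might look like.  The machine details are invented examples, and the snapshot helper again assumes VirtualBox’s VBoxManage; adapt it to whichever hypervisor you actually use.

```python
# Minimal sketch of a homelab "notebook": one structured record per VM,
# plus a helper that takes a named snapshot before risky changes.
# All machine details are invented examples; assumes VirtualBox/VBoxManage.
import subprocess

LAB_NOTEBOOK = {
    "splunk-indexer": {
        "purpose": "Splunk indexing server",
        "ip": "192.168.56.10",
        "subnet": "192.168.56.0/24",
        "credentials": "see password manager entry 'lab/splunk-indexer'",
    },
    "web-server": {
        "purpose": "Web server generating logs to forward to the indexer",
        "ip": "192.168.56.20",
        "subnet": "192.168.56.0/24",
        "credentials": "see password manager entry 'lab/web-server'",
    },
}

def take_snapshot(vm_name: str, label: str) -> None:
    """Take a named snapshot so the VM can be rolled back after testing."""
    # For full copies rather than snapshots, VBoxManage clonevm covers
    # the "clone an entire VM for quick deployment" case.
    subprocess.run(
        ["VBoxManage", "snapshot", vm_name, "take", label,
         "--description", f"Snapshot of {vm_name} before testing"],
        check=True,
    )

if __name__ == "__main__":
    for name, info in LAB_NOTEBOOK.items():
        print(f"{name}: {info['purpose']} ({info['ip']})")
        take_snapshot(name, "clean-baseline")
```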
