
MEET THE ARCHITECT Jonah May! Mr. Hackathon and Winner of the "I hacked my house" Award!

  • 16 September 2024

Hi Everyone!

 

We are back with the “Meet the Architect” series. It is fitting that after our last guest, Mr. Hackathon Maurice Keevenar, this week we have the other Mr. Hackathon, Jonah May!

If you did not already know, both Maurice and Jonah run the Veeam Hackathon, which is coming up on September 26, 2024!

Jonah is one of the smartest folks that I know, and he takes the cake when it comes to hacking: he started out at a very young age testing the limits of IT security, in his own house!

 

HACKING AT HOME

 

Geoff: Hi Jonah, I have to start this interview off by asking you to tell the story of how you first started to interact with computers. I know this story already and find it not only insightful but very funny, something about you hacking 🙂 in the house?

 

Jonah: 

Yes, my father, who is also a Vanguard and Ace, really likes to tell that story, doesn’t he? Well, let the record show that I was a Vanguard first.

 

Anyways, when I was a child, I was given a Windows 98 PC when my parents decided to upgrade to either XP or Vista. It was mainly so that I could play CD-ROM games offline, mainly a Carmen Sandiego game (ThinkQuick Challenge possibly?) and Lego Island. Apparently, I had two instances of “hacking” said computer.

The first was that at some point I became so familiar with the audio and subtitles of the games that I changed the language settings and used that familiarity to start teaching myself French. Unfortunately, none of it has stuck with me, given how long ago it was. The best I have is some remaining familiarity with the Romance languages from several years of Spanish class, first in pre-AP and later in IB in secondary school.

The other “hack” of said PC, I’m told, was figuring out on my own that if I added a default gateway to my computer, I could access the internet with it. I guess the machine was connected but missing the gateway, so my parents could install updates and games on it for me as needed. Similar to the language one, I don’t really remember that one either due to my age at the time.

 

The story I do remember, is taking a Raspberry Pi back in the early Alexa days and installing a GitHub project on it that leveraged Amazon APIs to create my own Echo and tie it into a few things around our house, namely the alarm system. I think I was a preteen and went to a friend’s house where I was shown the first- or second-generation Echo and wanted to have it at home. At that age I only got a small weekly allowance from chores and didn’t want to spend money on an Echo (or maybe my security-minded father wasn’t keen on the idea of them at the time?). So, I did what any know-it-all 12-year-old would do and borrowed his younger brother’s Raspberry Pi that was supposed to be for hobby projects. I never bookmarked the project I used, but I think it may have been this one: https://github.com/alexa-pi/AlexaPiDEPRECATED. Looking back it may also have been how I was introduced to Reddit, though I wouldn’t create an account for another several years.

 

 

CAREER START

 

Geoff: That is a great story; you were, so to speak, “a natural”! I assume it did not take you long to choose which profession you wanted to enter? How did your official IT career start, and how did you eventually land on what I call the Veeam dream (since Veeam has been so beneficial to so many of us in our careers)?

 

Jonah:

I’ve pretty much always wanted to work in tech. Back in high school I tried to get into some networking and server management classes at our school district’s career and technical education center. They would have been some really fun classes and would have helped me go to college with a few certs, like a CCNA. Unfortunately, I was in a combination of AP and IB classes and couldn’t make it work with my class schedule. I served as head programmer of one of our robotics teams as well as team captain for one year. Then, when my team was eliminated at the state level, I helped our sister team at the national competition during my senior year.

 

My IT career started in my second semester of college. I had just transferred from UNT (Computer Science) over to ASU (Software Engineering), as I had started working at Starbucks about six months before and they offered tuition reimbursement as well as letting me shift to online classes and gain some flexibility. Around that time, I was also looking to start a second job to start gaining some tech experience. It started with me applying to the Geek Squad and Apple Stores and ended with a part-time paid internship at a VCSP in Dallas. I would go in for 20-25 hours a week to build servers, swap tape drives in our data center, and help with the Tier 1 support queue as well as other simple tasks. My first major project was upgrading Veeam to 9.5 Update 3 on about 150 customer VBRs that we managed. Then in the evenings and on weekends I would work 30-40 hours doing closing shifts at the Starbucks around the corner from my apartment.

 

Over time, I worked my way up into a full-time position and several promotions, eventually ending up in a Tier 3 support role while also helping our chairman with development tasks as we were a relatively lean-staffed company for our size. Around this time, I also earned my first VMCA and about a year later, I became a Vanguard. Eventually, I moved over to my current company, where I’m trying to re-focus some of my time on finishing off my bachelor’s. As of earlier this year I’m enrolled in the Computer Science program at WGU and I’m currently working through my last 20 or so credits, while also earning some certifications outside of the VMCE and VMCA, like ITIL Foundations.

 

 

FAVOURITE VEEAM FEATURE

 

Geoff: 

That is what you would call working your way up.

Would you agree that working for a Service Provider gave you "work experience" on steroids? Basically, you are fast-tracked and introduced to every imaginable environment, and you have to think quickly on your feet. Some folks take years to get that kind of experience at normal IT shops.

What is your favorite Veeam feature, and what is the feature you have used the most in your work? They don’t necessarily have to be the same.

Jonah: 

I agree with that. I’ve always referred to it as “trial by fire” or “drinking from a fire hose”. It helps that I came into an organization that was quickly growing and forced to mature and address both growing pains and technical debt. For example, we were also a StorageCraft shop at the time. There were a number of legacy systems out there still using Server 2008 R2 with zero Windows updates installed and firewalls disabled. One of my earlier tasks was figuring out what ports Veeam, StorageCraft, and our custom reporting tools all needed so firewalls could be enabled. Then it was scripting unattended upgrades to 2012 R2, 2016, and 2019 using our RMM software to remotely upgrade 150 servers as we were approaching 2008 R2’s EOL date.

 

My favorite Veeam feature is a hard one. I haven’t been around quite as long as other people, but I have been around long enough to see the product evolve significantly. Right now, I’m a big fan of the recently added Proxmox support, though I feel it’s a little limited in v1, and I’m excited to see what future releases add.

The feature I use most often is definitely Instant Recovery. Something my previous and current service providers do that not all VCSPs offer is letting customers recover their BaaS workloads into our data centers, so they can get near-replication RTOs without paying the price premium that usually comes with keeping 1:1 replicas.

At my last company, I laid the framework to mostly automate the process for DR testing. I built a REST API that leveraged Veeam’s APIs and PowerShell cmdlets, with a UWP app frontend that would run on employee computers. By the time I left, we had Tier 1 engineers able to spin up customers, including virtual firewalls and VPNs, in under an hour. I’ve had discussions about building something similar with the dev team at my current company, but there are a lot of things we want to do for our customers and we’re working on prioritizing all of them.
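To give a rough feel for the kind of orchestration layer Jonah describes, a thin service that fans a customer's restore points out into individual Instant Recovery calls, here is a minimal Python sketch. The function names, field names, and request shape are hypothetical placeholders of mine, not Veeam's actual REST schema:

```python
# A minimal sketch of a DR-testing orchestration helper. All field and
# function names here are hypothetical illustrations, not Veeam's actual
# REST API schema.

def build_instant_recovery_request(restore_point_id, target_host, power_on=True):
    """Assemble a JSON-ready body for one (hypothetical) Instant Recovery call."""
    if not restore_point_id:
        raise ValueError("restore_point_id is required")
    return {
        "restorePointId": restore_point_id,
        "destination": {"host": target_host},
        "powerUp": power_on,
    }


def build_recovery_plan(restore_point_ids, target_host):
    """Expand a customer's restore points into one request body per workload,
    which a Tier 1 engineer could then submit as a single batch."""
    return [build_instant_recovery_request(rp, target_host)
            for rp in restore_point_ids]
```

In a real implementation each body would be POSTed to the backup server's REST endpoint and the resulting job polled for completion; the sketch only shows the fan-out step that makes a one-click DR test possible.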

 

THE FUTURE

 

Geoff: Finally, I want to ask: what does your vision of the future hold for IT and data protection?

 

Jonah: 

First off, I think we’ll continue to see an increase in adoption of object storage and security-oriented features. Security will be nice to see, so long as people remember that it is supplemental to and separate from existing policies and infrastructure, not a replacement. In my opinion, as Veeam continues to roll out features, there are spots where ties into your existing infrastructure make sense, such as SOC/SIEM integration, and areas where they don’t, such as inline malware scanning and Secure Restore. If my AV in production did not catch a penetration, then why would I expect it to catch anything new in malware scans of the backups?

 

As for object storage, it’s a powerful technology, but I think it’s a bit ahead of the curve for many small businesses and MSPs, at least when it comes to utilizing it on-premises. I love the scalability, resiliency, and easy maintenance it affords, especially as a service provider. However, as you know pretty well, it doesn’t really work well at a small scale with implementations like Ceph or MinIO. Object First does a great job of letting people get their foot in the door, but for a smaller, budget-conscious business, it is already hard enough to convince executives and finance to invest in both a VBR and a hardened Linux repository running on older hardware, let alone a shiny, new system that might have more storage than the business actually needs at the time. I still see a lot of implementations where customers are using Windows and ReFS with Veeam. I don’t think that is going away any time soon, though v13 letting you run the VBR on Linux will definitely help.

 

I’m not entirely sure how the trend will evolve, but I think we’ll see AI continue to work its way in as well. Veeam has done a pretty good job integrating the chat assistant into 12.1 and 12.2. I imagine with time we’ll continue to see the model improve and become better at helping troubleshoot things like job errors and suggesting fixes. I also have a hunch we’ll see it leverage the Data Protection API and/or Veeam ONE at some point for intelligent diagnostics. Maybe as an evolution to existing reports, such as the suspicious file activity and capacity planning systems?

 

However, I don’t think the hardware is quite there yet to do this. I, and I’m sure many other people, would prefer for data like that to remain local, rather than talk to OpenAI/Azure. Heck, I’m in the process of building my own locally powered voice assistant for my home in part because of no longer wanting to send my data to Amazon (Alexa has also gotten extremely dumb in the last year or two IMO, but that is a whole separate discussion). Until we see a point where LLMs and other CUDA/ROCm powered workloads can run locally and perform well without investment into expensive hardware like datacenter-class GPUs, I think adoption will be small, even if the capability were to get baked into Veeam tomorrow. Things like the Ryzen AI series are a good starting point, but I’m sure it will take a few more generations for chip manufacturers to really get it figured out and cost effective.

 

I think there’s also a lot of optimization being left on the table, in part because some of it comes at the cost of VRAM usage. TensorRT is a great example of this. It’s an Nvidia library that can run on top of most CUDA-powered GPUs, including the GTX 1080 from what I’ve seen online. I’ve tinkered with it some, even going so far as to use it with OpenAI’s Whisper for speech-to-text processing. It seems to perform about 3-5x faster than native Python and C++ libraries using regular CUDA, at the cost of using close to 10x more VRAM. Anyone interested in learning more can visit my GitHub repo here (https://github.com/JonahMMay/wyoming-whisper-trt). I have some quick benchmarks, a Docker container, and links to TensorRT to learn more. It isn’t anything fancy, just an attempt to combine multiple other GitHub projects out there to lower processing time.
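For anyone curious how a claim like “3-5x faster” gets measured, a simple wall-clock harness is enough to compare two implementations of the same workload. This is a generic sketch of my own (the helper names are mine), not code from the repository above:

```python
import time


def benchmark(fn, *args, runs=3):
    """Return the best wall-clock time (in seconds) over several runs.
    Taking the minimum filters out warm-up and scheduler noise."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best


def speedup(baseline_seconds, optimized_seconds):
    """How many times faster the optimized run is, e.g. roughly 3-5x
    in the TensorRT-versus-plain-CUDA comparison described above."""
    return baseline_seconds / optimized_seconds
```

In practice you would call `benchmark` once with the plain-CUDA transcription function and once with the TensorRT one, on the same audio file, then divide the two results; just make sure model loading and engine building happen before timing starts, or they will dominate the measurement.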

 

Thank You Jonah!

 

 

 

Another great interview with Jonah.  👍🏼

Love reading these when they are posted.


Fastest commenter in the west over here. You may have read this all before I did, and Geoff sent me the link the instant he posted it!

