The one thing about servers is that, if properly maintained, they just keep going and going and going.
I still have some Dell PowerEdge T410s holding up our on-prem infrastructure. We'll retire them before too long, although one might say they've been in operation for too, too long. I've also built an HCI (hyperconverged infrastructure) cluster out of Dell Minis to teach the concepts of clustered computing.
After Broadcom's acquisition of VMware, we'll probably see changes in their pricing (likely more expensive), but I've never really been a fan of VMware - mostly because they charged academic institutions the same licensing rates as everyone else, which to me is asinine (or at least they did back when I monkeyed with it). Now, for virtualization, I would just teach QEMU/KVM on Linux platforms, Hyper-V on Windows, and VirtualBox for the basic concepts. Sure, you lose the advanced enterprise management stuff, but to me, someone with 0-2 years of experience shouldn't be touching that stuff anyway.
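For the QEMU/KVM side, here's a minimal sketch of what a first lab VM might look like - the disk size, memory, ISO filename, and forwarded port are all my own illustrative assumptions, not anything prescribed:

```shell
# Create a 20 GiB thin-provisioned qcow2 disk for the guest.
qemu-img create -f qcow2 lab-vm.qcow2 20G

# Boot an installer ISO with KVM acceleration: 2 vCPUs, 4 GiB RAM,
# virtio disk, and user-mode networking with guest SSH (port 22)
# forwarded to host port 2222.
qemu-system-x86_64 \
  -enable-kvm \
  -cpu host -smp 2 -m 4096 \
  -drive file=lab-vm.qcow2,if=virtio \
  -cdrom ubuntu-24.04-live-server-amd64.iso \
  -nic user,hostfwd=tcp::2222-:22
```

That's the whole pedagogical appeal to me: two commands and a student can see every moving part of a VM, with nothing hidden behind a management console.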
But I'll agree with Tess here on keeping a couple of old bare-metal boxes at the ready.
Because, remember this:
There's no such thing as the cloud - it's just someone else's computer. And the cloud is made up of servers.
/r