
Are physical machines still the better choice?

This post is a bit of a ramble. My apologies in advance. Just looking for advice, I guess.

If I’m going to dive head-first into producing videos for my upcoming online Introduction to SQL course, plus supplementary “flipped-classroom” videos for other courses, I need some horsepower. For screen recording and video editing, I’ve purchased an i7 iMac with a Fusion Drive and 8GB of RAM. It should arrive next week and will replace the Mac mini I’ve been using for the past year.

For a database, I’ll use the same PostgreSQL backend software that I use for my on-ground class. But since the videos will be publicly accessible on YouTube, I don’t want to use the same server machine. That is, I don’t want the public to see exactly what my students will see, since doing so may pose a bit of a security risk. Consequently, I’ll run the database on a separate machine.

A few options are available:

I have a small Dell Optiplex 620 “Slim Form Factor” machine sitting in a closet. It’s the machine that ran the database for the videos I’ve already posted to YouTube. It’s perfectly adequate for basic work, and it has an Nvidia GT 610 in it for simple CUDA programming, but I’m afraid it may start to creak if I begin working with larger datasets. The temptation to upgrade to something snappier is looming large.

I could throw together an Intel i3 machine out of parts for about $400. As a plus, I could toss in an Nvidia GTX 680 I happen to have lying around, for bigger CUDA projects.

I could purchase a used, but more current, Optiplex 755 or 780 from Craigslist for about $180, but I wouldn’t be able to put anything larger than the GT 610 into it. Still, it would run the database with ease. I like small machines that I can tuck into an out-of-the-way place.

I suppose I could run the Ubuntu server in a VirtualBox VM on the soon-to-be-unused Mac mini. It would be difficult to run the machine headless, though.
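If the headless worry is about the VM rather than the Mac mini itself, VirtualBox can at least start a machine without any console window from the command line. A rough sketch, using “ubuntu-db” as a stand-in for whatever I’d actually name the VM:

    # Start the VM on the Mac mini host with no GUI window.
    VBoxManage startvm "ubuntu-db" --type headless

    # When I'm done recording, ask the guest to shut down cleanly.
    VBoxManage controlvm "ubuntu-db" acpipowerbutton

That still leaves the question of driving the Mac mini itself without a display attached.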

But then I think: for $400, I could rent a Linode virtual machine for almost two years! And for the few times I need more oomph, I can temporarily fire up a more capable one for just a few dollars. When I want to do CUDA, I can just rent an Amazon EC2 GPU instance at 35 cents an hour.

A few years ago I promised myself I’d stop running server hardware and just go virtual. Why am I still thinking about buying new stuff? What is it about spending a few hundred dollars up front that seems better than paying a dozen dollars every month?

Having a local machine means no waiting and no setup when I just want to record a video or two; I can shut it down when I’m done. A virtual machine means setting it up from scratch every time I spin one up, though some StackScripts could help with that (a sketch of what I mean is below). A low-end local machine would satisfy 95% of my needs; a more capable one would satisfy 100%, but at twice the expense.
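A StackScript wouldn’t need to be much, since it’s just a script Linode runs on the first boot of a freshly deployed machine. Something along these lines would get PostgreSQL in place; the role and database names are placeholders, not the ones from my class setup:

    #!/bin/bash
    # Hypothetical StackScript sketch: turn a fresh Ubuntu Linode into a
    # throwaway PostgreSQL box for a recording session.

    apt-get update
    apt-get -y install postgresql

    # A role and an empty database to load the demo data into.
    sudo -u postgres createuser instructor
    sudo -u postgres createdb -O instructor demo_db

Loading the actual sample data would still be a manual step (or a few more lines here), but that covers most of the from-scratch setup.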

What would you do?