Cool. I hope I get the chance to read it.
Any digital computer (based on binary on/off signals) works with these basic logic gates. When you take a digital circuits class, you start with these gates, then you use them to build little circuits called "flip-flops" and from there you build counters, accumulators, comparators, etc.
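The buildup from gates to flip-flops can be sketched in a few lines of code. This is just an illustrative simulation (hypothetical names, not any real library): every gate here is built out of NAND, and the "flip-flop" is the classic SR latch made of two cross-coupled NAND gates.

```python
# Illustrative sketch: basic logic gates as Python functions,
# then an SR latch (a simple flip-flop) built from NAND gates.

def NAND(a, b):
    return not (a and b)

def NOT(a):
    return NAND(a, a)          # NAND with both inputs tied together

def AND(a, b):
    return NOT(NAND(a, b))     # AND is just NAND followed by NOT

def OR(a, b):
    return NAND(NOT(a), NOT(b))  # De Morgan: a OR b = NAND(NOT a, NOT b)

def sr_latch(s, r, q):
    """One step of a cross-coupled NAND SR latch (active-low inputs).

    s=False sets the output True, r=False resets it to False,
    and s=r=True holds the previous value q -- that stored bit
    is what makes it a memory element, not just a gate.
    """
    for _ in range(2):           # let the feedback loop settle
        q_bar = NAND(r, q)
        q = NAND(s, q_bar)
    return q

q = sr_latch(False, True, False)  # "set" pulse: q becomes True
q = sr_latch(True, True, q)       # hold: q stays True
q = sr_latch(True, False, q)      # "reset" pulse: q becomes False
```

Chain enough of these latches together with some gating logic and you have a register; add carry logic and you have a counter or an accumulator. That stacking is the whole game, whether the NAND is a vacuum tube pair or a few transistors.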
This is the way it has worked since vacuum tube computers. The basic logic gates haven't changed. What has changed is that we have figured out how to pack millions of these logic circuits (counters, accumulators, comparators) into very small semiconductors. But the basic building blocks haven't changed.
There were some experiments done with analogue computers (where the signal was a level rather than binary on/off). As far as I know, these never saw use outside of a few very specific applications.
But to answer your question: vacuum tubes use the same basic building blocks that are used in semiconductor circuits... and so, theoretically, if you have infinite money, space, power, and time, you could build any circuit using vacuum tubes.
One measure of how fast a computer works is "FLOPS". FLOPS stands for "floating point operations per second" and you can think of this as how many pairs of numbers it can multiply in a second. The ENIAC (a very powerful vacuum tube computer) ran at about 500 FLOPS.
Your typical cell phone can handle at least 1,000,000,000 FLOPS (that's a billion, and modern phones do far more). So yeah, getting any result on your vacuum tube computer is going to take a lifetime.
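To put those two numbers side by side, here's the back-of-the-envelope arithmetic, using the rough figures above (500 FLOPS for ENIAC, a billion for a phone):

```python
# Rough comparison: how long does a billion floating point
# operations take on each machine? (Figures are approximate.)

eniac_flops = 500             # ENIAC, roughly
phone_flops = 1_000_000_000   # a billion; modern phones do far more

work = 1_000_000_000          # one billion operations

phone_seconds = work / phone_flops    # 1 second on the phone
eniac_seconds = work / eniac_flops    # 2,000,000 seconds on ENIAC
eniac_days = eniac_seconds / 86_400   # about 23 days

print(phone_seconds, eniac_seconds, eniac_days)
```

One second of phone work is about three weeks of ENIAC time, and that's before you account for tubes burning out mid-run.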