I just caught up with an old friend and walked through what I’ve been up to in the (many) years since I departed Texas. This friend isn’t a real techie, so I had to take a higher-level look at the various companies and projects I’ve worked on over the last [number redacted] years.
Halfway through the list I realized that almost every project revolved around some form of virtualization. And not just the “virtual machine” version of this term, but the more general English definition of “separating out a logical view from its physical implementation”. The list runs something like:
- MPEG hardware: My undergraduate research thesis focused on building hardware to accelerate the decode of video streams (yes, the early MPEG-1 days!). The data stream always had to stay the same; it was up to the hardware to convert it to useful video more efficiently.
- Ada compiler: I also spent a summer at Convex Computers (now part of HP) working on a compiler that could unroll loops and optimize unmodified Ada code to utilize the company’s vector hardware. It sure would have been simpler if we were able to add hints to the source code, but that wasn’t allowed. The challenger (against Cray in this case) often doesn’t have the luxury of asking for changes specifically on their behalf.
- SimOS: My dissertation focused on a complete machine simulator capable of running an unmodified IRIX operating system and its binaries. This was a major pain to get right (and fast), but allowed us to study real-life applications and get previously unseen visibility into system performance.
- MIPS R10000: While at the tail end of graduate school, I worked at SGI in the MIPS architecture group to help design their newest processor. While MIPS has one of the simplest instruction sets, backward compatibility was still a pain that restricted several possible optimizations.
- VMware: Don’t need to say much more here. Whether for servers, storage, networking, or desktops, the engineering obsession was always about allowing completely unmodified applications to work seamlessly in a more agile, portable, and efficient environment. Early attempts to simplify this challenge (paravirtualization, for example) sure sounded nice, but we knew they created a barrier to adoption that would be hard to overcome early on.
- Recent Investments: And my early investments at General Catalyst have all focused on this as well. The two that most exemplify this passion are still in stealth mode, so stay tuned for a proper unveiling. Both of them work with existing workloads and user behavior, quietly doing things behind the scenes for dramatic improvements.
I meet so many startups that offer IT Nirvana if you just ignore existing hardware and software. At the end of the day, the requirement of working with existing applications, code, or environments is a pain. It’s always easier to have a completely “greenfield” and no compatibility requirements… which reminds me of this quotation of unknown origin:
“God created the world in seven days — because he had no legacy infrastructure”
But today’s businesses do have legacy infrastructure and a slew of existing applications, processes, and user behaviors. While always keeping an eye out for great clean-slate solutions, I suspect I’ll continually come back to those that also try to fit in!
Just a little retrospective navel-gazing for a sunny Tuesday…