I thought I was too harsh every now and then, but I’m a complete amateur when compared to Minh Ha’s take on OpenFlow.
Indeed, Quantum Computing and OpenFlow have a lot in common. They both create stories with emotional appeal, they both require the invention of new physics, and they're both filled with grand vision, grandstanding, and empty promises. But there's no shortage of PhDs, high hopes, cash infusions from VCs, and a Cambrian explosion of research papers, much of whose content isn't worth the paper it's printed on.
From what I see, OpenFlow, or SDNv1, is just a rebranding movement to create hype and gather funding for areas that needed no change and no new buzzword. Network management/automation, virtualization, and even whiteboxing were all a natural evolution of the virtualization resurgence of the 2000s, following their server counterparts, and OpenFlow added nothing to the paradigm to further our knowledge, except proving the obvious: a centralized control plane, let alone a literally centralized controller, just doesn't scale.
It requires the invention of new physics to work efficiently, and what a shame its proponents, many of them high-end academics who did bleeding-edge research in networking and were supposed to know better, only realized this a few years down the track. The OF originators probably believe in Multiverses and think OF can be made to work in one of their Many Worlds. And when it turned out to be a fiasco, thanks to that need for new physics which doesn't exist in our Universe, the SDN hype got re-associated with all sorts of other things, first network automation, then the programmable data plane. In recent years the programmable-data-plane disease seems to be in full swing; it's pitched as the next big thing, and its proponents want in-network computing, probably riding the great power of AI, to be the next paradigm shift.