Today's convergent core technologies prefigure tomorrow's converged applications.
by John Jainschigg
We know, we know. We're supposed to be a communications product/application magazine. Finished products and applications, ready for purchase and installation. Products and applications used to drive carrier revenue. Products and apps used to achieve business goals: to communicate, collaborate, reach customers, transact.
So why are we devoting most of Communications Convergence to core technologies? To chips, electronic subsystems, high-availability computing, media processing boards, and media servers?
Simple answer: This tech will drive the next generation of devices, infrastructure, and apps. The nature and specifics of these core technologies will govern, in part, what we do with them. So careful review is a useful predictor of things to come: new toys, new businesses, new opportunities for (sober) investment.
The contribution of this tech will be both limiting and liberating.
Core tech selection factors heavily in the success of technology products (and viability of business plans) - most directly by determining whether designs are feasible. If your DSP dissipates too much heat, your beautiful design may not scale.
And that's just the beginning. Core tech influences cost of development, and the degree to which development requires partnering, licensing of intellectual property, OEM partnerships; and the facility with which such collaborations can be pursued. It controls time-to-market - which controls access to capital, to "buzz." And it influences cost of manufacturing and the price of finished goods.
Even beyond the specifics of getting a product to market, core tech determines how much of the effort can be leveraged in future efforts (e.g., new products that reuse code), and to generate revenue by other means (e.g., licensable IP).
In other words, choosing core tech is (still) a big deal.
Hardware a big deal? This runs counter to a dominant idea of convergence: that the underlying hardware is increasingly standardized, commoditized, and ultimately software-driven; and that you (software) don't need to dance with what (hardware) brung ya to the bash. In fact, the foregoing principle is quite valid as a predictor - the emergence of host-based (as opposed to DSP-based) media servers (page 14) is a shining example. Yet for each example, a counterexample exists: It's also true that "vitrified signal processing" - the collapse of efficient DSP algorithms into purpose-built ASICs, achieving order-of-magnitude performance advantages over mathematically identical software-driven implementations - is expected to be a "real big deal" in the next five years. Because it's cheaper, faster, better.
Looking at core tech is also an opportunity to see how far our industry has come. Today's core technologies are more powerful than ever, better supported, compatible with former and future sibling devices (preserving the value of IP across generations of technology), cheaper, lower-power, faster and more capacious.
Much of this improvement has been deliberate. Core tech makers are very creative, accustomed to leveraging their creativity in concrete and primal ways (e.g., at the moment when sand becomes circuitry). (By contrast, a Windows XP app programmer lives in a cushy padded cell with strictly enforced visiting hours.) So circuit-and-board-and-server folks come up with really cool ideas. Ideas on which whole industries can be founded; and on which "legacy good ideas" can founder. For a good example, check out my semiconductors piece (page 55) for clues about the new magnetic-inductance wireless technology that will obsolete Bluetooth for use in wireless headsets and PANs.
Core tech makers are also exemplary at listening to customers and providing service - each generation of TI DSP, for example, incorporates not only broad improvements for many customers, but selective adaptations for individual customers. And these semi-custom adaptations sometimes have broad - even definitive - appeal, and influence beyond the local situation. Let software flip sections of a DSP on and off, and you raise the question: "How do you engineer software that makes best use of power-cycling?" This ultimately opens up new ways of thinking about software, per se (i.e., "the thermodynamic efficiency of algorithms"). Which leads (or so one intuitively grasps) to discoveries in physics, computer science, and allied fields.
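The power-cycling question can be made concrete with a toy energy model. A sketch, with every figure invented for illustration (these are not any real DSP's numbers, TI's or otherwise): if each wake-up from sleep costs a fixed slug of energy, then batching the same work into fewer, longer bursts beats waking for every sample.

```python
# Hypothetical power budget - illustrative values only.
ACTIVE_MW = 200.0   # assumed power while processing, milliwatts
SLEEP_MW = 2.0      # assumed power while powered down
WAKE_MJ = 0.5       # assumed fixed energy cost per wake-up, millijoules

def energy_mj(n_wakeups: int, active_ms: float, sleep_ms: float) -> float:
    """Total energy in millijoules over one scheduling period."""
    return (n_wakeups * WAKE_MJ
            + ACTIVE_MW * active_ms / 1000.0
            + SLEEP_MW * sleep_ms / 1000.0)

# Same total work (100 ms active, 900 ms asleep) scheduled two ways:
eager = energy_mj(n_wakeups=100, active_ms=100.0, sleep_ms=900.0)  # wake per sample
batched = energy_mj(n_wakeups=1, active_ms=100.0, sleep_ms=900.0)  # one long burst

print(f"eager:   {eager:.1f} mJ")    # 71.8 mJ
print(f"batched: {batched:.1f} mJ")  # 22.3 mJ
```

The computation is identical either way; only the schedule differs - which is exactly why power-cycling turns scheduling into a first-class software design problem.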
It's a rich mix, and a fascinating study. An appropriate theme for our March issue, which will be distributed at CT Expo (March 4-6, Los Angeles Convention Center). CT Expo started life, more than a decade ago, as a show called "Telecom Developers." And from time to time, it feels good to get back to your roots.