A Tale of Two Embeddeds

Attending an embedded-focused trade show is always an interesting experience. Those of us who cover the broad embedded market hope you'll excuse us if we are unable to check our schizophrenia at the door, for the embedded market today makes The Three Faces of Eve look like a minor childhood setback. There are truly two faces of the 32-bit/64-bit embedded market, and never the twain shall meet.

In one corner we have the down-and-dirty embedded PC market, characterized by x86 notebook/netbook processors on a broad set of off-the-shelf single board computers and computer-on-module products in a variety of industry-standard form factors. In this segment, hardware design is focused on system-level issues and integration of diverse components. Actual board design is limited to the specialized I/O needed to implement the application. Design of a custom CPU board, let alone a custom CPU chip, is reserved for the lunatic fringe. It's an off-the-shelf market.

Software design focuses on application development. Customization and configuration of the operating system is unnecessary because the underlying architecture is locked down to PC standards implemented religiously in processors and chipsets. Windows is king, with Linux taking a large share of the former DOS market. Software designers in this space have no use for an in-circuit emulator (ICE), much less a software simulation tool of any kind. Why bother, when you can run your application at full speed on your desktop PC with outstanding yet inexpensive debug tools at your immediate disposal? This is self-hosted development. Let's admit that this segment is not the place for applications with any kind of real-time (read "deterministic") requirements, which would be buried under an avalanche of variable clock-cycle execution times, instruction pipelines and caches in that huge CISC machine.

On the flip side, we have tiny, power-efficient RISC cores available in off-the-shelf chips or in the form of cores for the hardy do-it-yourselfer to roll his or her own CPU. Every implementation targets an application and is therefore different. Chip design skills, FPGA design, or at least a solid foundation in system design are a must on the hardware side. Thorough, accurate simulation is essential, as the cost of error is enormous. Bus considerations focus more on on-chip interfaces like ARM's AMBA than on the off-chip interconnects such as PCI Express found in that "other" market. On a good day, there might be an off-chip 16-bit local bus.

And the software side is as different as night and day. Even the names of the major operating systems serving this segment are Greek to the x86 designer: VxWorks, QNX, INTEGRITY, Nucleus, ThreadX, OSE and others. The very first software step is to create a board support package (BSP) or basic driver set for that custom CPU implementation—a task that requires very different skills from application development, skills found in the x86 world only among BIOS developers. Debugging, of course, requires highly effective simulation, cross-development (host-target) and remote debugging via a JTAG ICE. Determinism, or hard real time, is the name of the game here.

Over the years, there have been a few efforts at cross-fertilization between these diverse segments. Even now, just a few RISC-based small-form-factor (SFF) boards are available off the shelf in non-standard form factors. But a veteran x86 designer would face a brave new world of simulation, cross-compiling, remote debugging, and tools and operating systems with a few more digits after the dollar sign.

Within the RISC world, there has been little penetration by x86 processors or look-alike cores in spite of the best efforts of the major proponents of x86 architecture. Even off-the-shelf small form factor SBCs and COMs are few and far between. Having spent time in the late 90s trying to engineer just such a crossover, we can tell you it feels like trying to fit Shaquille O’Neal with a size six pump. You can squeeze like crazy, but even if the shoe fits, you won’t be able to walk.

So what conclusions can we draw from these diverse, zero-overlap embedded market segments? Neither one could replace the other. Crossover is difficult, if not impossible. Even moving people from companies serving one space to the other is fraught with danger. Both serve specific applications with requirements that rarely overlap. Both are needed. Both meet their respective application requirements and carry their own set of challenges for the future. And let's leave it at that.