Gary Smith EDA (GSEDA) is the leading provider of market intelligence and advisory services for the global Electronic Design Automation (EDA), Electronic System Level (ESL) design, and related technology markets.


Threads are Dead

DAC always makes you walk away with many impressions. Sometimes one stands out more than the others. Last year we were happy to hear that “It’s the Software Stupid” resonated with many engineers. This year the one that stuck in my mind was a statement made by Professor Wen-mei Hwu from the University of Illinois at Urbana–Champaign, during the Thousand-Core Chips panel. He said, “Threads are dead.”

For those who haven’t been following the move to multi-core, multi-processing, there are two issues that affect the EDA community. The first is the design issue: this year the cost of software development for an SoC is passing the cost of the actual IC design. This is actually a major opportunity, as designers are turning to the EDA vendors to solve the problem.

The second issue is far more of a problem than an opportunity. EDA software is on the leading edge of applications software development technology. We develop some of the most sophisticated and performance-demanding applications in the world. Today that leading edge is shifting from the dominant Von Neumann computing model to multi-core, multi-processing parallel computing. Highly specialized supercomputing is going mainstream, and EDA is on the leading edge of that shift. So what are we going to do about it?

Not that we haven’t been playing around with parallel computing. Quite a few EDA tools, especially in the IC CAD area, have the capability to do some level of parallelism. In the past that was a competitive advantage; once you hit 100 million gates it becomes a competitive imperative, the price of admission to the 45nm market. Unfortunately, most of the tools feature some flavor of threading. Now that works well in embarrassingly parallel programs, and fortunately we have some of those, but in most applications you are pretty much restricted to four threads. Some of the best and brightest developers can do six threads, but they are few and far between.

You might remember the VLIW (Very Long Instruction Word) craze in the 1990s. Unfortunately, the semiconductor companies found that that approach severely limited their market; you just couldn’t find enough VLIW programmers. Today the fact that a processor is VLIW is usually listed in the fine print of the spec sheet, and a lot of work has been done to shield the programmer from the parallel programming issues inherent in VLIW architectures.

Then there is the nondeterministic nature of threading. The fact that threaded programs are almost unverifiable is a nuisance in EDA tool development, but it is a critical issue in embedded multi-core programming; and yes, software engineers are entering the wonderful world of verification. That, of course, is why Professor Hwu pronounced threads dead.
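To make that last point concrete, here is a minimal sketch, my own illustration rather than anything from an EDA tool or from the panel: two POSIX threads increment a shared counter in plain C with no locking, and the final total depends on how the scheduler happens to interleave them.

    /* Minimal sketch of threading nondeterminism: two threads race on an
     * unprotected shared counter, so the result varies from run to run.
     * Build with: cc race.c -o race -lpthread
     */
    #include <pthread.h>
    #include <stdio.h>

    #define ITERATIONS 1000000L

    static long counter = 0;            /* shared, unsynchronized state */

    static void *worker(void *arg)
    {
        (void)arg;
        for (long i = 0; i < ITERATIONS; i++)
            counter++;                  /* read-modify-write race */
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;

        pthread_create(&t1, NULL, worker, NULL);
        pthread_create(&t2, NULL, worker, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);

        /* You would expect 2,000,000; the actual value depends on the
         * interleaving, and it changes from run to run. */
        printf("counter = %ld\n", counter);
        return 0;
    }

Every possible interleaving is a distinct behavior that has to be verified, which is exactly the problem Professor Hwu was pointing at.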

Therefore what we are moving to today is API- or library-based parallel programming. That way you can add parallel constructs while still using the C language. At this point you have to start buying new debuggers and possibly new compilers. Now we’re starting to switch from our present Von Neumann programming infrastructure to a concurrent programming infrastructure.
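As a sketch of what that looks like, take OpenMP as one such example (my choice of illustration, not something named above): a pragma hands the loop iterations to the runtime to spread across cores, while the surrounding code stays ordinary C.

    /* Sketch of API/pragma-based parallelism in C, using OpenMP as one
     * common example.  Build with: cc -fopenmp saxpy.c -o saxpy
     */
    #include <stdio.h>

    #define N 1000000

    static float x[N], y[N];

    int main(void)
    {
        const float a = 2.0f;

        for (int i = 0; i < N; i++) {   /* ordinary sequential setup */
            x[i] = (float)i;
            y[i] = 1.0f;
        }

        /* The pragma asks the OpenMP runtime to split these iterations
         * across the available cores; the loop body is still plain C. */
        #pragma omp parallel for
        for (int i = 0; i < N; i++)
            y[i] = a * x[i] + y[i];

        printf("y[42] = %f\n", y[42]);
        return 0;
    }

The attraction is that the algorithm still reads like sequential C; the price is exactly the new compilers and debuggers mentioned above.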

Next up on the list is to start with a concurrent algorithm in the first place. There are some interesting start-ups presently looking at this area. In fact, one of the surprises has been the surge in the use of FORTRAN for algorithmic development. It seems that in 1995 some very useful concurrent constructs were added to the language. Add to that the fact that FORTRAN uses arrays rather than threads, making it a deterministic language, and you have the resurgence of a very old warhorse.

Of course the best of all would be a concurrent language with all of the programming infrastructure needed for parallel architectures. DARPA did award contracts to IBM, Sun, and Cray for just that reason. The question is, who will use this language? If you look at the Software Engineering community, you are talking about a group comparable in size to the Design Engineering community. A switchover of that magnitude doesn’t sound that hard, especially when you consider that using C, and its accompanying infrastructure, is causing design costs to go through the ceiling. But what about all the programmers? That is a big number. Well, there is always F# ;->

Gary Smith




