
Re: [bafuture] Where we are now

  • Wayne Radinsky
    Message 1 of 5 , May 21, 2004
      On Fri, 21 May 2004 10:14:51 -0700 (PDT), Troy Gardner
      <thegreyman@...> wrote:
      > Aren't FPGAs pretty much as generic as you can get? They're flexible enough
      > to do pretty much anything. I know that JPL and other robotics people use them
      > for several reasons: to reroute around circuits damaged by solar radiation,
      > and to change processing on the fly (devote 100% of the processors to
      > vision/spectral analysis one moment, then in the next 80% to broadcasting
      > signals). Several of the ones from Xilinx have (several!) PowerPC cores on the
      > same die to do the floating-point processing, to get the best of both worlds.

      Ok, historically, the answer would be no, FPGAs are not as
      generic as you can get. FPGAs -- field-programmable gate
      arrays -- are limited by the number and arrangement of
      their configurable logic blocks. This is why the
      programmable logic device market has been divided into
      CPLDs and FPGAs -- the CPLDs (complex programmable logic
      devices, which despite the name are the simpler, smaller
      parts) handle the glue-logic needs, and the FPGAs handle
      field-programming needs that extend beyond the capability
      of a CPLD. Most of the profit of the companies that sell
      programmable logic chips, such as Xilinx, Altera, Lattice
      Semiconductor, IBM Microelectronics, Agere, etc., comes
      from the high-end FPGAs rather than the CPLDs, because
      those parts carry significantly higher margins.
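
      As an aside, the "programmable gates" can be pictured
      concretely. The sketch below is only an illustration (mine,
      in Python, not any vendor's toolflow): each logic cell in
      an FPGA fabric is essentially a small lookup table (LUT)
      whose stored bits determine which Boolean function it
      computes, so capacity is limited by how many such cells,
      and what routing, the part provides.

```python
# Illustrative sketch of an FPGA logic cell: a k-input lookup table
# stores 2**k bits and can therefore implement ANY Boolean function
# of its k inputs. "Programming" the device means loading these bits.

def make_lut(truth_table):
    """Return a 2-input gate defined by its 4-entry truth table."""
    def lut(a, b):
        # Index the stored bits by the input pattern (a, b).
        return truth_table[(a << 1) | b]
    return lut

# The same cell configured first as an AND gate, then as an XOR gate.
and_gate = make_lut([0, 0, 0, 1])   # outputs for (a,b) = 00, 01, 10, 11
xor_gate = make_lut([0, 1, 1, 0])

print(and_gate(1, 1), xor_gate(1, 1))  # -> 1 0
```

      The same table-driven cell becomes a different gate just by
      changing the stored bits, which is the whole trick behind
      field programmability.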

      Today, though, as you point out, modern FPGAs do have
      embedded microprocessors, which are by definition
      Turing-complete, so it's certainly possible to build full
      Turing machines out of FPGAs. So we're really looking at
      different cost/benefit trade-off points. With an
      application-specific integrated circuit (ASIC) design you
      get the best performance at the highest cost. Then FPGA,
      then CPLD, then down to your general-purpose CPUs, which
      give you the lowest performance at the lowest cost, but
      also with the greatest flexibility.

      The point I wanted to make is that the pressure to sell
      fewer chip designs at higher volumes actually accelerates
      this trend -- the trend toward more general-purpose chips.
      So we should expect ASIC designs to migrate to PLDs, and
      PLD designs to become software.

      So the AMD guy has a point, that everything heads towards
      software running on general-purpose hardware, and the low
      end tends to advance upward and usurp everything in its wake.

      > I haven't figured out what degree of this software on FPGAs is human-designed,
      > versus assisted, or evolved/evolvable in the future. I also wonder how much of
      > the 'black arts' of physics is just a matter of time before it becomes
      > simulated, like Machinima for particle physics. Design and innovation are hard,
      > but not outside the realm of brute force (e.g. Kasparov vs. IBM).

      I'm not sure what you are getting at here. Why wouldn't
      software, including software written for FPGAs rather than
      CPUs, be machine-designed in the future? At the present
      time the problem is that if you sell EDA (electronic design
      automation) software that solves a complex problem related
      to semiconductor physics, but the *user* doesn't understand
      the physics and doesn't understand how the software solves
      the problem, then they can't effectively use the software
      to design circuits.

      Wayne
    • Wayne Radinsky
      Message 2 of 5 , Jun 6, 2004
        So this is a bit interesting. Which company does Wall
        Street like better, Xilinx or Altera?

        http://finance.yahoo.com/q/bc?t=1y&s=XLNX&l=on&z=m&q=l&c=ALTR

        That's for the last year. A two horse race, neck and neck.
        What if you look beyond the last year?

        http://finance.yahoo.com/q/bc?s=XLNX&t=my&l=on&z=m&q=l&c=ALTR

        Coincidence? Normally I expect high-tech markets to
        consolidate, with one gorilla dominating the marketplace.
        Does the PLD industry have two gorillas? Now if I were
        making bets, I would bet on Xilinx, because they are first
        to use 300mm wafers and the 90nm process, while Altera is
        staying behind for a while at 130nm. That is, if you
        believe one gorilla is going to dominate the other, rather
        than it staying a two-gorilla race. Uh, wait, wasn't it
        supposed to be two horses?

        Now Troy brought up the issue (below) of microprocessors
        versus programmable logic devices like those from Xilinx
        and Altera.

        So now, comparing with Intel, we see the PLD guys doing
        slightly better ("slightly" here could be random noise)
        than the microprocessor maker. I expected the
        microprocessor to win, given the time frame.

        http://finance.yahoo.com/q/bc?t=my&s=XLNX&l=on&z=m&q=l&c=INTC

        Remember, past returns are no guarantee of future performance :)

        There's no question today that programmable logic is
        stealing business away from the application-specific IC
        (ASIC) business. So the real question is, are
        microprocessors (Intel, AMD, etc.) on a collision course
        with PLDs (Xilinx, Altera, etc.)? The PLD way is to use
        embedded processors and other System-on-Chip (SoC)
        components, then add custom logic gates for whatever the
        application needs. The microprocessor way is to have a
        general-purpose processor and do everything else in
        software -- but over time, you add new CPU instructions
        for common tasks, such as digital signal processing (DSP).
        Who is going to be the winner?
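
        To make the "do it in software, then add instructions"
        path concrete, here is an illustrative Python sketch
        (mine, not from the thread) of a FIR filter, the classic
        DSP workload. Its inner multiply-accumulate loop is
        exactly the operation that dedicated DSP instructions --
        or a block of PLD logic -- would accelerate.

```python
def fir(signal, taps):
    """Direct-form FIR filter: each output sample is a weighted
    sum of the most recent input samples."""
    out = []
    for n in range(len(signal)):
        acc = 0
        for k, tap in enumerate(taps):
            if n - k >= 0:
                acc += tap * signal[n - k]  # the multiply-accumulate step
        out.append(acc)
    return out

# A 2-tap moving-sum filter over a short test signal.
print(fir([1, 2, 3, 4], [1, 1]))  # -> [1, 3, 5, 7]
```

        In software this loop costs two instructions per tap; a
        fused multiply-accumulate instruction halves that, and
        hard-wired logic can do all the taps at once, which is
        the trade-off space the two camps are fighting over.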


        On Fri, 21 May 2004 14:52:03 -0700, Wayne Radinsky <waynerad@...> wrote:
        >
        > On Fri, 21 May 2004 10:14:51 -0700 (PDT), Troy Gardner
        > <thegreyman@...> wrote:
        > > Aren't FPGAs pretty much as generic as you can get? They're flexible enough
        > > to do pretty much anything. I know that JPL and other robotics people use them
        > > for several reasons: to reroute around circuits damaged by solar radiation,
        > > and to change processing on the fly (devote 100% of the processors to
        > > vision/spectral analysis one moment, then in the next 80% to broadcasting
        > > signals). Several of the ones from Xilinx have (several!) PowerPC cores on the
        > > same die to do the floating-point processing, to get the best of both worlds.
        >
        > Ok, historically, the answer would be no, FPGAs are not as
        > generic as you can get. FPGAs -- field-programmable gate
        > arrays -- are limited by the number and arrangement of
        > their configurable logic blocks. This is why the
        > programmable logic device market has been divided into
        > CPLDs and FPGAs -- the CPLDs (complex programmable logic
        > devices, which despite the name are the simpler, smaller
        > parts) handle the glue-logic needs, and the FPGAs handle
        > field-programming needs that extend beyond the capability
        > of a CPLD. Most of the profit of the companies that sell
        > programmable logic chips, such as Xilinx, Altera, Lattice
        > Semiconductor, IBM Microelectronics, Agere, etc., comes
        > from the high-end FPGAs rather than the CPLDs, because
        > those parts carry significantly higher margins.
        >
        > Today, though, as you point out, modern FPGAs do have
        > embedded microprocessors, which are by definition
        > Turing-complete, so it's certainly possible to build full
        > Turing machines out of FPGAs. So we're really looking at
        > different cost/benefit trade-off points. With an
        > application-specific integrated circuit (ASIC) design you
        > get the best performance at the highest cost. Then FPGA,
        > then CPLD, then down to your general-purpose CPUs, which
        > give you the lowest performance at the lowest cost, but
        > also with the greatest flexibility.
        >
        > The point I wanted to make is that the pressure to sell
        > fewer chip designs at higher volumes actually accelerates
        > this trend -- the trend toward more general-purpose chips.
        > So we should expect ASIC designs to migrate to PLDs, and
        > PLD designs to become software.
        >
        > So the AMD guy has a point, that everything heads towards
        > software running on general-purpose hardware, and the low
        > end tends to advance upward and usurp everything in its wake.
        >
        > > I haven't figured out what degree of this software on FPGAs is human-designed,
        > > versus assisted, or evolved/evolvable in the future. I also wonder how much of
        > > the 'black arts' of physics is just a matter of time before it becomes
        > > simulated, like Machinima for particle physics. Design and innovation are hard,
        > > but not outside the realm of brute force (e.g. Kasparov vs. IBM).
        >
        > I'm not sure what you are getting at here. Why wouldn't
        > software, including software written for FPGAs rather than
        > CPUs, be machine-designed in the future? At the present
        > time the problem is that if you sell EDA (electronic design
        > automation) software that solves a complex problem related
        > to semiconductor physics, but the *user* doesn't understand
        > the physics and doesn't understand how the software solves
        > the problem, then they can't effectively use the software
        > to design circuits.
        >
        > Wayne
        >
      • Wayne Radinsky
        Message 3 of 5 , Jun 15, 2004
          Another article on EDA. This one is more readable by normal humans.

          EDA Needs Quantum Shift
          The Intel CTO suggested that the industry must completely
          change the way it looks at EDA in order to create the tools
          that will continue to scale chips down.
          http://www.reed-electronics.com/electronicnews/article/CA424258?spacedesc=news