The multi-dimensional development path of FPGA
Since FPGAs entered the market in the 1980s, they have coexisted with general-purpose CPUs, ASICs, and even GPUs. Their low power consumption, programmability, and modest size have earned them a place in the market. This article surveys how FPGAs are used today in communications, HPC, data centers, and other fields, offers a general analysis of the market, pricing, and competing products, and predicts some future directions for FPGAs, providing a useful reference for understanding the technology.
This article summarizes a three-hour discussion held at Stanford University in September 2019, which brought together practical experience from a variety of companies and research institutions, including Zilog, Altera, Xilinx, Achronix, Intel, IBM, Stanford, MIT, Berkeley, University of Wisconsin, Technion, Fairchild, Bell Labs, Bigstream, Google, DIGITAL (DEC), SUN, Nokia, SRI, Hitachi, Silicom, Maxeler Technologies, VMware, Xerox PARC, Cisco, and others. The aim is to provide a broader vision and new ideas for the multi-dimensional development of FPGAs.
FPGAs (Field-Programmable Gate Arrays) have been entangled with the ASIC community since their inception. Ross Freeman and colleagues purchased the technology from Zilog in the mid-1980s to start Xilinx, a company aimed at the ASIC emulation and education markets. Zilog itself came out of the Exxon oil company, whose venture grew out of 1970s fears that oil would run out within 30 years, a prediction that is still being repeated today. At almost the same time, Altera was founded around similar core technology.
An FPGA is a chip that can be programmed with a circuit description, effectively "emulating" that circuit. Compared with an ASIC implementation of the same circuit, the emulation runs slower: it is clocked lower and consumes more power, but it can be reprogrammed with a new circuit in a few hundred milliseconds.
FPGAs are used to emulate ASIC designs before the masks are made and submitted to the fab for fabrication. Companies such as Intel and AMD use FPGAs to emulate their chips before production.
The scramble in the field of telecommunications
FPGAs have been heavily used in the telecommunications industry. Telecommunications standards keep changing, which makes building telecom equipment harder, and the first companies to deliver solutions tend to capture the largest market share. ASICs have a long manufacturing cycle, and FPGAs provide a shortcut. Early versions of telecom equipment therefore used FPGAs, which caused FPGA prices to fluctuate. While the ASIC emulation market is not sensitive to FPGA prices, chip prices are critical for telecom companies. Years ago, AT&T and Lucent made their own FPGAs, called ORCAs (Optimized Reconfigurable Cell Arrays), but compared to Xilinx or Altera they were not competitive in silicon speed or size.
Today, Huawei has become the largest customer for FPGAs. U.S.-made FPGAs may even be a trigger for the recent tensions between the U.S. and China: the chips give Huawei an edge in delivering 5G telecom equipment, roughly a two-year lead over any other supplier in the world ready to compete.
FPGA price battle
FPGAs have long been used in SDRs (software-defined radios). SDR technology supports multiple communication standards in a single radio, much like a telephone that speaks multiple languages. Here FPGAs ran into trouble, because SDR adoption took two different paths. On the one hand, commercial suppliers developed many cost-effective solutions and have deployed SDR technology in practically every base station on the planet. In defense, on the other hand, large contractors build SDRs to protect lucrative legacy product lines. This has kept prices of FPGA-based radios high, and parts of the U.S. defense market have resisted their use.
Next, FPGA vendors tried to push into the DSP and embedded markets, introducing FPGAs with hard-core microprocessors on board. But the pressure to sell these new FPGAs was so high that customers who rejected the new chip families were blacklisted by the chipmakers, sometimes going without support for months. Given how often FPGA companies have failed to conquer new markets, growth pressure on the FPGA market remains enormous. And because FPGAs have a large die area and embed a great deal of intellectual property, it is difficult to bring FPGA prices down.
Hit a wall in HPC and the data center
Over the past few years, FPGAs have tried to expand into the HPC (high-performance computing) and data center markets. In 2017, Microsoft announced the use of Altera FPGAs in its data centers; Intel had already acquired Altera in 2015. In 2018, Xilinx announced its "Data Center First" strategy, and its CEO declared in front of a room full of analysts that Xilinx was no longer a pure FPGA company. Dramatic, but a historical necessity.
The main obstacle to using FPGAs in HPC and data centers is place and route: the time it takes to run the FPGA vendor's software to map a circuit onto the FPGA's resources. For large FPGAs, even on fast CPU servers, place and route can take up to three days, and in many cases the software still fails to find a mapping after three days.
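To make the scale of the problem concrete, the sketch below shows what driving such a batch build from a script can look like. It is only an illustration: it assumes the Xilinx Vivado toolchain is installed and on PATH, and the RTL file, constraints file, top-level module name, and part number are placeholders. The placement and routing commands inside the generated script are the step that can run for hours or days on a large device.

```python
"""Rough sketch of a scripted place-and-route run in batch mode.

Assumes the Xilinx Vivado tools are on PATH; the RTL file, constraints file,
top-level module name, and part number below are placeholders.
"""
import subprocess
from pathlib import Path

BUILD_TCL = """\
# Placeholder sources and target part -- substitute a real design here.
read_verilog top.v
read_xdc constraints.xdc
synth_design -top top -part xcvu9p-flga2104-2-i
opt_design
# Placement and routing: on a large device these two commands are the
# hours-to-days step discussed in the text, and routing may fail outright.
place_design
route_design
report_timing_summary -file timing.rpt
write_bitstream -force top.bit
"""

def run_flow(workdir: str = "build") -> None:
    work = Path(workdir)
    work.mkdir(exist_ok=True)
    (work / "build.tcl").write_text(BUILD_TCL)
    # Launch the vendor tool in batch mode and wait for it to finish.
    subprocess.run(["vivado", "-mode", "batch", "-source", "build.tcl"],
                   cwd=work, check=True)

if __name__ == "__main__":
    run_flow()
```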
Hit a wall in the oil and gas field
Around 2007, applications in the oil and gas sector formed a niche market for FPGAs. On a conventional computer, simulating the subsurface to find oil took longer than actually building the rig and drilling in the field. FPGA accelerators changed this time-consuming inversion dramatically. The first FPGA system to compute seismic images in an oil company's data center was built by Maxeler Technologies and delivered to Chevron.
The use of FPGAs in oil and gas expanded over the years, until pressure from the ASIC industry brought standard CPU technology back. Prediction and simulation are still important in oil and gas today, and seismic imaging is mostly done on CPUs and GPUs, but FPGAs still have a place. As the saying goes, today's novelty is tomorrow's old news, and today the new things are artificial intelligence and a focus on data.
Still, FPGAs are a shortcut to market, an easy way to gain a competitive advantage, and an essential technology in many mission-critical situations. Per chip, FPGAs are more expensive than ASICs, but in HPC and data centers a given workload needs fewer FPGA chips and less cooling than it would on CPUs or GPUs, so running it on FPGAs is significantly cheaper than running the same software on CPUs or GPUs. FPGAs make data centers smaller, which can make operators uncomfortable: they worry that their data centers may shrink.
ASIC vs FPGA
Another use for FPGAs is as a complement to ASICs. While ASICs are built for fixed functionality, adding FPGAs provides some flexibility for new product changes and adaptation to different markets.
Modern FPGAs integrate more and more hardened function blocks and look increasingly like ASICs. ASICs, in turn, often add some FPGA fabric to a design to ease debugging, testing, and field repair, and to keep the flexibility to add small features.
But ASIC teams have struggled with the FPGA mindset. The ASIC designer asks, "What functionality does the user want?" and gets impatient with the answer "I'm not sure yet."
The driverless car industry is one such new battleground. Algorithms are constantly changing, and laws and regulations may shift as cars reach the road, so driving functions need to be adjusted continually, which demands flexible, changeable solutions. FPGAs run at lower clock frequencies, need smaller heatsinks, and are physically smaller than CPUs and GPUs; the lower power consumption and smaller size make them the obvious choice. Still, GPUs are easier to program and do not require three days of place and route.
Another critical consideration is the need to run the same code in the car and in the cloud, for simulation and testing. This requires the FPGA to be available in the cloud before it can be used in the car. Because of these issues, many developers prefer GPUs.
FPGA Evolution
FPGAs are constantly evolving. Modern interfaces are making them easier to program, more modular, and easier to combine with other technologies. FPGAs support the AXI (Advanced eXtensible Interface) bus, which makes them easier to program but also introduces serious efficiency losses that reduce FPGA performance and ultimately make them less competitive. Academic work has proposed ways to address these routing problems, such as Eric Chung's paper on dynamic networks in FPGAs, but these advanced concepts have not yet been adopted by industry.
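As an illustration of what the AXI abstraction buys on the software side, the following is a minimal sketch of driving an AXI-Lite control interface from Linux user space. It assumes the FPGA block is exposed as a UIO device at /dev/uio0, and the register offsets and bit meanings are hypothetical examples rather than any real core's register map.

```python
"""Minimal sketch of driving an FPGA block's AXI-Lite registers from Linux.

Assumes the block is exposed as a UIO device at /dev/uio0; the register
offsets and bit meanings below are hypothetical, not a real core's map.
"""
import mmap
import os
import struct

MAP_SIZE = 0x1000                         # one 4 KiB page of the register window
CTRL, STATUS, DATA_IN = 0x00, 0x04, 0x08  # hypothetical register offsets

def reg_write(regs: mmap.mmap, offset: int, value: int) -> None:
    regs[offset:offset + 4] = struct.pack("<I", value)   # 32-bit write

def reg_read(regs: mmap.mmap, offset: int) -> int:
    return struct.unpack("<I", regs[offset:offset + 4])[0]

def main() -> None:
    fd = os.open("/dev/uio0", os.O_RDWR | os.O_SYNC)
    regs = mmap.mmap(fd, MAP_SIZE)
    try:
        reg_write(regs, DATA_IN, 42)      # hand an operand to the accelerator
        reg_write(regs, CTRL, 1)          # hypothetical "start" bit
        while reg_read(regs, STATUS) & 1 == 0:
            pass                          # poll the hypothetical "done" bit
        print("accelerator finished")
    finally:
        regs.close()
        os.close(fd)

if __name__ == "__main__":
    main()
```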
How is the FPGA connected? For HPC workloads with heavy data flow, PCI Express can be used together with communication-hiding techniques that overlap transfers with computation. But what about fine-grained workloads such as NFV (network function virtualization) that serve a large number of users simultaneously? Recent findings from VMware indicate that for NFV and virtual machine acceleration, the FPGA must be directly attached to the CPU and use cache coherence as the communication mechanism. A key requirement, of course, is that a crash of the FPGA does not bring down the CPU, and vice versa. Large technology companies are re-examining requirements from the IBM mainframe era, intending to use standardized platforms to contain the growing complexity.
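The communication hiding mentioned above usually boils down to a double-buffering pattern. The sketch below is purely conceptual: dma_to_fpga(), run_kernel(), and dma_from_fpga() are hypothetical stand-ins for a vendor's DMA and runtime API, and the point is only that the transfer of batch N+1 overlaps the computation on batch N.

```python
"""Conceptual sketch of communication hiding over PCIe via double buffering:
while the FPGA computes on batch N, the host streams batch N+1 to the card.

dma_to_fpga(), run_kernel(), and dma_from_fpga() are hypothetical stand-ins
for a vendor's DMA and runtime API.
"""
from concurrent.futures import ThreadPoolExecutor

def dma_to_fpga(batch):       # placeholder: host -> device transfer over PCIe
    ...

def run_kernel(batch):        # placeholder: run the programmed circuit on the FPGA
    ...

def dma_from_fpga(batch):     # placeholder: device -> host transfer of results
    ...

def process_stream(batches):
    results = []
    with ThreadPoolExecutor(max_workers=1) as copier:
        in_flight = None                                # batch currently on the FPGA
        for batch in batches:
            upload = copier.submit(dma_to_fpga, batch)  # start copying the next batch...
            if in_flight is not None:
                run_kernel(in_flight)                   # ...while computing on the previous one
                results.append(dma_from_fpga(in_flight))
            upload.result()                             # make sure the copy has landed
            in_flight = batch
        if in_flight is not None:                       # drain the final batch
            run_kernel(in_flight)
            results.append(dma_from_fpga(in_flight))
    return results
```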
Opportunities also exist in the broader enterprise market. Given an FPGA platform, companies that lack the budget for ASIC development, and that do not follow the latest silicon manufacturing challenges and solutions, can still develop their own circuits and build a competitive advantage into their products. For example, emerging IoT (Internet of Things) edge computing puts computation next to sensors and displays, or even directly in the path of passing data streams.
At the same time, FPGA companies are pushing up the technology stack toward the CPU socket. Intel dominates that market with technologies such as special instructions for NFV. The main barriers to adding new CPUs and FPGAs in the data center are not only speed and cost, but also the availability of software and drivers for all possible I/O devices.
The key to deploying FPGAs in the data center is ease of use, for example automated tools that drive FPGA applications and hide the place-and-route challenges. Microsoft pioneered the use of FPGAs in large data centers to accelerate Bing, NFV, and artificial intelligence algorithms, building abstractions, domain-specific languages, and flexible hardware infrastructure along the way. Commercially, the main problem with FPGAs is the go-to-market strategy.
It is too late to start thinking about software after a new chip has been built. How should hardware be adapted to software so that it benefits from the software that already exists? This is also an opportunity to rethink FPGA architecture. But be warned: the silicon industry devours capital. Building an ASIC is a poker game in which the stakes have climbed year after year, a winner-take-all contest that removes the FPGA threat early in the game.
FPGA Niche Market
As software designers like to say, "whatever can be done in software should be done in software." ASIC designers would say, "whatever an ASIC can do, an ASIC should do." The funniest version is, "if it can be done in software, then you don't have to deal with all the FPGA-minded people." Compared with the size of ASIC teams and the number of software developers worldwide, FPGA companies are small and the community is small, made up of a few, sometimes eccentric, programmers.
FPGAs may outperform CPUs and GPUs, but the real lesson from industry and the investment community is that, for most of the time since the advent of computers, speed and real-time behavior have not been that important. Few people buy a computer purely for high performance; it happens occasionally, but a business cannot be built on such isolated cases. In addition, FPGAs lack standards, open source code, and an appealing programming model.
As a result, there is no standard, market-supported way to write FPGA programs that work on all FPGA chips or can easily be cross-compiled. Maxeler Technologies offers advanced solutions that provide such interfaces, but broad industry adoption requires trust. Trust is what takes a technology from being a plaything for early adopters to benefiting everyone, and it has to be driven and supported by the established vendors in the data center space.
In practice, application users say, "I don't care how it is done, as long as it does what I want." So in which under-explored application areas can FPGAs play a role? Industry already uses FPGAs for real-time computing. For computer vision on drones, FPGAs have advantages in weight and power consumption. Hardware upgrades on satellites are prohibitively expensive, and FPGAs provide critical long-term flexibility there. What FPGAs need are products whose fortunes are tied to theirs: products that are easy to program, and that are not just hardware or software but ecosystems, complete solutions.
Just-in-time compilation and automatic generation of FPGA programs are a promising way to push beyond the limits of the current market. That is easier said than done, but with artificial intelligence breaking through across application spaces, more and more opportunities are emerging. These days everything seems to be done with AI; even traditional algorithms such as seismic imaging in oil and gas now use it. Handling AI modules requires both scientific and engineering solutions, and FPGAs offer a good starting point: first connect AI blocks to the fabric, then integrate them into it. For example, Xilinx's next-generation chips combine AI engines, CPUs, 100G interfaces, and FPGA fabric on the same 7-nanometer chip.
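A minimal sketch of what such just-in-time FPGA program generation could look like is shown below. Here compile_to_bitstream(), program_fpga(), run_on_fpga(), and run_on_cpu() are hypothetical stand-ins for a vendor toolchain and runtime; the idea is simply to keep serving requests in software while the hours-long place-and-route runs in the background and to switch over once a bitstream exists.

```python
"""Conceptual sketch of just-in-time FPGA program generation: serve requests
in software until a bitstream for the requested kernel exists, compile it in
the background, and switch to the FPGA once place-and-route completes.

compile_to_bitstream(), program_fpga(), run_on_fpga(), and run_on_cpu() are
hypothetical stand-ins for a vendor toolchain and runtime.
"""
import hashlib
import threading
from pathlib import Path

CACHE = Path("bitstream_cache")
CACHE.mkdir(exist_ok=True)
_inflight = set()
_lock = threading.Lock()

def compile_to_bitstream(kernel_src, out_path):  # placeholder: HLS + place-and-route
    ...

def program_fpga(bitstream_path):                # placeholder: load bitstream onto device
    ...

def run_on_fpga(data):                           # placeholder: hardware execution
    ...

def run_on_cpu(data):                            # placeholder: software fallback
    ...

def _compile_async(kernel_src, out_path):
    def job():
        compile_to_bitstream(kernel_src, out_path)   # the hours-long step
        with _lock:
            _inflight.discard(out_path)
    threading.Thread(target=job, daemon=True).start()

def run(kernel_src, data):
    bitstream = CACHE / (hashlib.sha256(kernel_src.encode()).hexdigest() + ".bit")
    if bitstream.exists():                  # cache hit: use the compiled circuit
        program_fpga(bitstream)
        return run_on_fpga(data)
    with _lock:
        if bitstream not in _inflight:      # cache miss: compile once, in the background
            _inflight.add(bitstream)
            _compile_async(kernel_src, bitstream)
    return run_on_cpu(data)                 # keep serving requests in software meanwhile
```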
From another perspective, AI chips generate and process large amounts of data, so FPGAs are needed to feed inputs and pull outputs quickly. As new ASICs for AI processing come to market, FPGAs will play a big role at AI chip companies.
FPGA Development Forecast
- There will be successful CPU+FPGA server chips, or FPGAs with direct access to the CPU cache hierarchy.
- SoC (system on a chip) FPGA chips will continue to grow, driving industries such as medical, next-generation telecommunications, and automotive.
- Developers will use FPGAs to do amazing things and move the world forward, but the fact that there are FPGAs inside will be kept quiet.
- The FPGA name will remain, and chips called FPGAs will continue to appear, but what is inside them will be very different.
- If we give up (dataflow) optimizations in order to simplify FPGA programming, FPGA performance will degrade and FPGAs will no longer be able to compete with easy-to-program CPUs.
- FPGAs will have dynamic routing, evolving interconnects, and flexible data movement at runtime.
- Like the complete software stack on top of the FPGA, place-and-route software will become open source. Yosys and the Lattice FPGA ecosystem are already moving in this direction.
- All semiconductor architectures will be combined on a single chip that contains TPU, GPU, CPU, ASIC, and FPGA elements; a given chip may integrate all of these technologies or only some of them.
- More chips will be focused on specific application spaces, and only a few will be general purpose. In a sense, everything will be an SoC.