A young DARPA-backed startup with a fresh spin on a low-power computer chip has raised over $100 million in a Series B funding round, a sign of the wild appetite for more energy-efficient ways to build bigger and better AI. The company, EnCharge AI, aims to move AI’s heaviest workloads from big, power-hungry data centers to devices at the edge, including laptops and mobile devices, where energy, size, and cost constraints are tighter. Its approach, known as analog in-memory computing, comes from research that CEO Naveen Verma spun out of his lab at Princeton University, where he’s still a professor of electrical and computer engineering.

Verma wouldn’t say who its customers are. But in addition to the U.S. Defense Advanced Research Projects Agency (DARPA), which gave it $18.6 million last year, a who’s who of industrial, electronics, and defense players is interested in EnCharge’s chips. The oversubscribed funding round, led by Tiger Global, includes the intelligence community’s investment unit In-Q-Tel, alongside the venture arms of defense giant RTX, power producer Constellation Energy, South Korea’s Samsung, and Taiwan’s Hon Hai (Foxconn). The Santa Clara, California-based startup is also working with semiconductor giant Taiwan Semiconductor (TSMC) to produce its first-generation chips.

The new investment brings EnCharge’s total funding to more than $144 million, and will help the 60-person company commercialize its technology, which isn’t cheap in the world of semiconductors. “Given the capital intensity of the semiconductor industry, the Series B is an important step for advancing commercialization” of its first chips, Verma tells Fast Company. He declined to disclose the company’s new valuation.

EnCharge’s push comes at a pivotal moment for the AI industry, which is grappling with the fast-growing energy and compute demands driven by a storm of generative AI. The advent of DeepSeek last month has brought new efficiencies and lower costs to AI model training and inference. (It’s unclear if more widespread use of DeepSeek-like models will cancel out those efficiency gains.) But DeepSeek is unlikely to stem the industry’s demand for more compute, more memory, and more energy.

EnCharge says that, for a wide range of AI use cases, its specialized chips, or accelerators, require up to 20 times less energy than today’s leading AI chips. To make it work, the company relies on a high-wire technique. Rather than using only digital transistors to perform some of the multiplication operations at the heart of AI inference (the continuous computations that produce chatbot outputs), EnCharge’s chips exploit the non-binary wonders of the analog world. “Analog computing isn’t new,” says Verma, “but EnCharge’s specific implementation and system-level approach address many of the fundamental issues that caused previous analog computing approaches to fail.”

Finding efficiencies in analog amid the noise

Memory access is computing’s biggest energy hog, and in AI, inference, rather than training, makes up the bulk of most models’ computations. Type a prompt and press enter, and the process of inference begins somewhere in the cloud, which is to say in hulking data centers where giant clusters of hot, energy-intensive GPUs and CPUs demand massive amounts of electricity and water. Along with the existing environmental costs, the energy required to train and run generative models on these chips is spiking demand on a stretched-thin energy grid.
According to the International Energy Agency, a typical request to ChatGPT consumes 10 kilojoules, roughly ten times more than a typical Google search. Memory’s energy demands also impose limits that could slow machine learning progress. Those include the way that, on a chip, the speed of computation is outpacing the bandwidth of memory and communication. Researchers call this problem the von Neumann bottleneck, or the memory wall.

EnCharge’s approach to the challenge is part of a decades-long quest to find efficiencies by placing memory circuits not next to, but inside, a computing core, a technique called in-memory computing (IMC). Though it can be tricky to pull off, IMC promises speed-ups by bringing memory and logic closer together and making it far less computationally costly to access memory.

This is where the analog computing comes in. Whereas digital devices since the mid-twentieth century operate in a world of on or off, discrete 1s and 0s, analog devices exploit the in-between information of physical phenomena (such as electrical, mechanical, or hydraulic quantities), which allows them to store more data and operate at far lower energy than digital processors. (Quantum computers take the idea to another level, by exploiting the behavior of very, very tiny things.) Because the states in analog devices may be, in the case of EnCharge’s chip, a continuum of charge levels in a tiny resistive wire, the difference between analog states can be smaller than that between 1 and 0. That requires less energy to switch between values. And that reduces the “data movement costs” between a chip’s memory and compute, says Verma.

But, like quantum, analog computing is notoriously noisy and difficult to scale. Verma says EnCharge addresses the accuracy and scalability problems using precise geometry control of its metal-wire capacitors, static random-access memory (SRAM) to store the model weights, and a digital architecture that includes a software stack and a compiler to map applications to the chip. “The result is a full-stack architecture that is orders-of-magnitude more energy efficient than currently available or soon-to-be-available leading digital AI chip solutions,” Verma says. “This includes all the digital-to-analog and analog-to-digital conversion overheads, which need to be designed in specialized and integrated ways with the in-memory-computing architecture.”

To reduce the costs involved in converting from digital to analog and back, the chip relies on a technique Verma calls “virtualized” IMC. “This involves performing computations directly within a first level of memory, but also by using a memory-system hierarchy, in an analogous way to virtualized memory systems, to enable the computations to efficiently scale to very large AI workloads and models. While traditional architectures face decreasing bandwidth and increasing latency as data size grows,” he wrote, “EnCharge’s virtualized IMC enhances latency and efficiency when accessing and computing on larger amounts of data,” making it efficient for both small and large language models.

Since Verma spun out the research in 2022, the company has been working with customers to refine and derisk its hardware and software designs. The current chips, discrete boards on PCIe cards, can run machine-learning algorithms at over 150 tera operations per second (TOPS) per watt, versus 24 TOPS per watt by an equivalent Nvidia chip performing an equivalent task.
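For a rough sense of scale, throughput-per-watt figures like those translate directly into energy per operation. The sketch below is a back-of-envelope illustration only, not a benchmark from EnCharge or Nvidia, and the per-token workload size is an assumption chosen purely to make the numbers concrete.

```python
# Back-of-envelope: convert TOPS-per-watt figures into energy per operation.
# 1 TOPS/W means 1e12 operations per joule, so energy/op = 1 / (TOPS_per_watt * 1e12).
# The per-token workload below is an illustrative assumption, not a figure from the article.

def joules_per_op(tops_per_watt: float) -> float:
    return 1.0 / (tops_per_watt * 1e12)

OPS_PER_TOKEN = 2e10  # assumed ~10 billion multiply-accumulates (2 ops each) per generated token

for name, tpw in [("EnCharge PCIe card (reported)", 150.0),
                  ("comparable GPU task (reported)", 24.0)]:
    e_op = joules_per_op(tpw)          # joules per single operation
    e_token = e_op * OPS_PER_TOKEN     # joules for the assumed per-token workload
    print(f"{name}: {e_op * 1e15:.1f} fJ/op, ~{e_token * 1e3:.2f} mJ per token (assumed workload)")
```

Under those assumptions, the reported 150 TOPS per watt works out to roughly 7 femtojoules per operation versus about 42 for the 24 TOPS-per-watt comparison point; the absolute per-token figures depend entirely on the assumed workload size.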
A newer process to trace finer chip features has allowed the company to triple its energy efficiency metric, to about 650 TOPS per watt. “The efficiency breakthrough of EnCharge AI’s analog in-memory architecture can be transformative for defense and aerospace use cases where size, weight, and power constraints limit how AI is deployed today,” said Dan Ateya, president and managing director of RTX Ventures. “Continuing our collaboration with EnCharge AI will help enable AI advancements in environments that were previously inaccessible given the limitations of current processor technology.”

Dozens of companies are developing new kinds of chips and other architecture to grapple with the energy and computing challenges of AI training and inference. Startups like Cerebras Systems, SambaNova Systems, and Graphcore, acquired last year by Japan’s SoftBank, have sought to compete with Nvidia in the AI training market. Cerebras, which sells its giant AI chips and offers services to customers through the cloud, filed paperwork in September to list its shares on the Nasdaq in an initial public offering. In its IPO prospectus, the company said it expects the AI computing market to grow from $131 billion in 2024 to $453 billion in 2027.

Other companies are also exploring in-memory analog computing, including the startups Mythic, Neo Semiconductor, and Sagence. In a set of new papers, IBM Research scientists also demonstrated advances in analog in-memory computing, including research on a brain-inspired chip architecture for large models, as well as phase-change memory for smaller edge-sized models, and algorithm advances. Analog in-memory computing “could substantially improve the energy efficiency of LLMs by leveraging mixture of experts (MoEs) models,” according to one of the studies, which is featured on the January cover of the journal Nature Computational Science.

The Defense Department also continues to pursue analog computing. The Defense Innovation Unit (DIU) on Monday released a solicitation for a “digital engineering platform” to accelerate the design and validation of analog chips, as well as mixed-signal, photonic, and hybrid varieties. “[T]he design of these chips is often a bottleneck, with prolonged design cycles and high redesign rates,” said the solicitation. “Current manual design processes are time-consuming, iterative, and error-prone. Furthermore, the rising costs of prototyping and the shortage of skilled analog designers have created bottlenecks in the development pipeline. The DoD needs a way to accelerate the design process and reduce errors.”

Russ Klein, the program director of Siemens EDA’s high-level synthesis division, told Semiconductor Engineering in December that if an analog IMC system like EnCharge’s can effectively scale, it could establish a new energy standard for inference and other high-performance computing. “The energy savings of not having to move all that data and the parallelism that IMC promises will massively impact not just AI, but any computation that takes place on large arrays of data,” he said.
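To make the in-memory multiply-accumulate idea concrete, here is a toy numerical sketch: a dot product computed “analog style” by summing noisy physical quantities, compared against its exact digital counterpart. It is a conceptual illustration under arbitrary assumptions (vector length, noise level), not a model of EnCharge’s capacitor-based circuits.

```python
# Toy sketch of analog in-memory computing: a dot product (the multiply-accumulate
# at the heart of inference) accumulated as one noisy physical summation, versus an
# exact digital multiply-accumulate. Conceptual illustration only; the sizes and
# noise level are arbitrary assumptions, not EnCharge's design.
import numpy as np

rng = np.random.default_rng(0)

weights = rng.uniform(-1, 1, size=256)       # stationary weights, as if held in memory
activations = rng.uniform(-1, 1, size=256)   # incoming activations

# Digital reference: exact multiply-accumulate.
digital = float(np.dot(weights, activations))

# "Analog" version: each product is carried by a physical quantity (e.g., charge)
# and accumulated by superposition; each element picks up a small random error
# before the sum, mimicking device noise and mismatch (assumed 1% deviation).
noise = rng.normal(0.0, 0.01, size=256)
analog = float(np.sum(weights * activations + noise))

print(f"digital MAC result: {digital:.4f}")
print(f"analog  MAC result: {analog:.4f} (error {abs(analog - digital):.4f})")
```

The point of the sketch is only that many multiplications collapse into a single physical summation, which is where the data-movement savings come from, and that the accumulated noise is what an analog design must keep under control, which the article says EnCharge does through precise geometry control of its capacitors and a surrounding digital architecture.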
Category: E-Commerce
In 2018, after imposing steep tariffs on steel and aluminum imports, Donald Trump famously tweeted, “Trade wars are good, and easy to win.” Trump’s time out of the White House has not changed his mind on that subject. Since his inauguration last month, he has set about remaking American trade policy even more dramatically than he did in his first term. Two weeks ago, he imposed across-the-board tariffs against Mexico, Canada, and China, and though he paused the tariffs on Mexico and Canada, they’re still scheduled to go into effect on March 4. This week, he once again imposed heavy tariffs on steel and aluminum imports (those will go into effect on March 12), and while in 2018 he had excluded imports from certain countries from the duties, this time around he’s putting the steel and aluminum tariffs on imports from every country in the world. Finally, on Thursday, Trump rolled out a whole new set of import taxes, putting in place a system of reciprocal tariffs: whatever tariff a country imposes on U.S. imports of a product, the U.S. will now impose on imports of that product from that country.

These moves aren’t surprising. Trump loves few things the way he loves tariffs, and appears wholly unconcerned about the fact that tariffs raise prices for both U.S. businesses and U.S. consumers. (As he put it earlier this month, “We may have, in the short term, a little pain, and people understand that.”) But what is striking, though little-noticed, is that Trump has been able to impose these tariffs unilaterally. Not only has he not consulted with Congress, but he hasn’t even had the office of the U.S. Trade Representative make a case for why the tariffs were necessary. In effect, he’s raising taxes on imports because he feels like it.

Tariff loopholes

This isn’t something the people who wrote the Constitution ever envisioned happening. In fact, the Constitution does not give the president the power to impose tariffs or make trade policy. Instead, it explicitly gives those powers to Congress alone, awarding it the authority to set duties and imposts (taxes on foreign goods) and to “regulate Commerce with foreign Nations.” But Trump isn’t getting Congress to pass laws imposing these tariffs on foreign imports; he’s doing it by executive order, acting entirely on his own.

How is Trump able to do this? By taking advantage of massive loopholes that Congress has created over the past 60 years, delegating much of its power over trade to the president while taking very little care to limit what the president can do with that power. Trump’s legal justification for his steel and aluminum tariffs, for instance, is Section 232 of the Trade Expansion Act of 1962, which allows the president to impose tariffs as high as he wants on specific industries, as long as the Department of Commerce determines that imports in those industries are a threat to national security (a term the law does not define). He justified his across-the-board tariffs on Canada and Mexico by declaring illegal immigration and fentanyl smuggling a national emergency, and then invoking the International Emergency Economic Powers Act of 1977, which gives him the power to impose tariffs during, yes, a national emergency. As for his reciprocal-tariff scheme, Trump will likely rely on Section 301 of the 1974 Trade Act, which allows the president, through the U.S. Trade Representative, to impose tariffs in response to any act, policy, or practice of a foreign country that the USTR finds is “unjustifiable” or “unreasonable” (terms the law, again, does not define).
A ‘national emergency’

The problem with all of these laws is that the language they use is so vague and ill-defined that they effectively enable the president to do pretty much whatever he wants whenever he wants. Trump’s justifications for his policies are in many cases self-evidently ridiculous: Imports of steel from Canada or Australia, for instance, obviously do not threaten American national security, nor is the vanishingly small amount of fentanyl smuggled over the Canadian border every year a national emergency. But federal courts historically have been uninterested in overriding the president’s judgment about what constitutes a national-security threat, or an unjustifiable trade practice, and as a result have basically given the president free rein over trade policy.

That has not been a huge problem in the past, because presidents have only rarely chosen to impose tariffs unilaterally. When George W. Bush imposed steel tariffs in 2002, for instance, it caused considerable controversy, simply because that kind of action was so unusual. And before Trump, the national security exemption for tariffs had been used primarily to ban oil imports from countries like Iran and Libya (which quite plausibly did pose a threat to national security). Even when presidents did invoke Section 301, it was typically used to negotiate trade settlements through the World Trade Organization.

Trump, though, loves tariffs more than any president in recent memory, and is no respecter of norms. So he has happily exploited the loopholes Congress has left open, creating the situation of permanent uncertainty U.S. businesses and consumers find themselves in today, where we literally do not know if we’ll wake up tomorrow to find a whole new round of import taxes imposed on the stuff we buy.

Congress could, of course, fix this problem overnight by simply repealing the laws that have outsourced so much responsibility over trade to the president. The Constitution puts trade policy in Congress’s hands for a good reason: Imposing tariffs is almost never something that needs to be done urgently and, like all tax increases, it can and should be done legislatively. Unfortunately, there’s been no real support from either party in Congress for the idea of taking back power over trade from the White House. Last fall, Senator Rand Paul did offer such a bill, one that would have required Congress to approve any tariffs the president wanted to impose. But it went nowhere. Now, with Republicans (who, aside from the rare rebel like Paul, have no interest in challenging Trump on his pet issue) in charge of both the Senate and the House, there’s very little chance of Congress doing anything anytime soon. So we had better get used to “Trump Imposes New Tariffs” headlines: There are going to be a lot of them over the next few years.
Category: E-Commerce
Jeff Bezos once said, “I like to wander.” That may seem counterintuitive in a business world obsessed with speed, but in a relentless pursuit of momentum, many leaders forget that speed without reflection leads to burnout, inefficiency, and poor decision-making. A report by Asana revealed that nearly 70% of executives say burnout has affected their decision-making ability. The paradox is clear: The faster we try to move without reflection, the more we risk burnout, inefficiency, and short-sighted decision-making.

Leaders often mistake pausing for procrastination. However, the reality is that strategic pausing is a high-performance leadership move that separates reactionary decision-makers from visionary leaders. It’s not about slowing down indefinitely; it’s about creating intentional space for recalibration so that when we do move forward, we do so with clarity, focus, and impact.

The high cost of constant acceleration

We live in an era where agility and rapid execution are prized above all else. But speed without strategy is like driving a high-performance car without brakes; eventually, you crash. Consider what happens when leaders don’t pause:

Burnout skyrockets: More than 75% of employees experience burnout, and leaders aren’t immune. Urgency breeds exhaustion.

Decisions suffer: Without pauses, leaders react instead of strategizing, leading to short-term fixes, not long-term solutions.

Innovation stalls: Breakthroughs don’t come from busyness. They emerge from reflection, setbacks, and unexpected insights.

When leaders don’t pause, they burn out, make poor decisions, and stifle innovation. But what if the very thing we fear, slowing down, is actually the secret weapon for sustainable success? Science backs this up.

The science behind slowing down

Neuroscience supports the idea that structured reflection enhances cognitive performance and decision-making. Harvard Business School research has shown that leaders who regularly engage in structured reflection improve their productivity and performance by 23%. There are two critical ways slowing down improves leadership effectiveness:

It activates diffuse-mode thinking. When we take breaks from active problem-solving, our brains process information in the background, leading to creative insights and better solutions.

It improves emotional intelligence. Leaders who pause before reacting better navigate difficult conversations, manage conflict, and lead with empathy, key traits that drive engagement and retention.

Jeff Bezos famously introduced the “Day One” mindset at Amazon, a philosophy that ensures the company never becomes complacent. While Amazon is known for rapid execution, its leadership regularly pauses to reassess its strategic direction. Bezos would take time away from operations to think long-term, a practice that helped Amazon evolve from an online bookstore into a global tech giant.

I once worked with a biotech leader whose team was stuck in a cycle of continuous problem-solving, trying to rush a product to market. I encouraged them to step back and ask, “What are we missing?” That moment of intentional pausing led to a breakthrough that fundamentally changed the company’s approach and resulted in a novel strategy no one had anticipated.

The ‘Slow down to speed up’ framework for leaders

How can leaders implement this strategy in their own organizations? Here’s a practical framework:

Pause with purpose: Book a 30-minute “strategy pause” into your weekly calendar. Treat it like a non-negotiable meeting.
Ask better questions: Start your next leadership meeting with a single, high-quality question that shifts the team’s thinking: What are we missing? Are we solving the right problem? What’s the long-term impact of this decision?

Create space for strategic thinking: Encourage teams to step away from constant execution. Google’s 20% Time policy, which allows employees to spend a portion of their workweek exploring new ideas, has led to some of the company’s most successful innovations, including Gmail and Google Maps.

Embrace rest as a performance strategy: Elite athletes know that recovery is as important as training; the same applies to leadership. Leaders who take intentional breaks return with sharper insights and renewed energy.

Foster a culture of reflection: Implement a 10-minute debrief ritual after major milestones to extract key lessons. Encourage teams to analyze what worked, what didn’t, and what could be improved.

Sustainable success isn’t about moving the fastest; it’s about moving with the greatest clarity. The leaders who make space for strategic pauses aren’t the ones who fall behind; they’re the ones who set the pace for everyone else. Before your next big decision, ask yourself: Am I moving fast just for the sake of moving? Or am I creating the space to move forward with clarity? The difference could define your leadership and your legacy.
Category: E-Commerce