
2025-02-14 15:25:00| Fast Company

The Treasury Department’s Office of Inspector General on Friday said it was launching an audit of the security controls for the federal government’s payment system, after Democratic senators raised red flags about the access provided to Trump aide Elon Musk’s Department of Government Efficiency team. The audit will also review the past two years of the system’s transactions as it relates to Musk’s assertion of alleged fraudulent payments, according to a letter from Loren J. Sciurba, Treasury’s deputy inspector general, that was obtained by The Associated Press.

The audit marks part of the broader effort led by Democratic lawmakers and federal employee unions to provide transparency and accountability about DOGE’s activities under President Donald Trump’s Republican administration. The Musk team has pushed for access to the government’s computer systems and sought to remove tens of thousands of federal workers.

“We expect to begin our fieldwork immediately,” Sciurba wrote. “Given the breadth of this effort, the audit will likely not be completed until August; however, we recognize the danger that improper access or inadequate controls can pose to the integrity of sensitive payment systems. As such, if critical issues come to light before that time, we will issue interim updates and reports.”

Tech billionaire Musk, who continues to control Tesla, X and SpaceX among other companies, claims to be finding waste, fraud and abuse while providing savings to taxpayers, though many of his claims so far are unsubstantiated. But there is a risk that his team’s aggressive efforts could lead to the failure of government computer systems and enable Musk and his partners to profit off private information maintained by the government.

Democratic Sens. Elizabeth Warren of Massachusetts and Ron Wyden of Oregon led the push for the inspector general office’s inquiry. On Wednesday, Warren, Wyden and Sen. Jack Reed, D-Rhode Island, sent a letter to Treasury Secretary Scott Bessent noting the inconsistencies in the accounts provided by his department about DOGE. “Your lack of candor about these events is deeply troubling given the threats to the economy and the public from DOGE’s meddling, and you need to provide a clear, complete, and public accounting of who accessed the systems, what they were doing, and why they were doing it,” the Democratic lawmakers wrote in their letter.

The Treasury Department provided conflicting information about DOGE’s access to the payment system. Initially, it claimed the access was read-only, only to then acknowledge that a DOGE team member briefly had the ability to edit code, and then to say in an employee sworn statement that the ability to edit was granted by accident. The 25-year-old employee granted the access, Marko Elez, resigned this month after racist posts were discovered on one of his social media accounts, only for Musk to call for his rehiring with the backing of Trump and Vice President JD Vance.

Advocacy groups and labor unions have filed lawsuits over DOGE’s potential unauthorized access to sensitive Treasury payment systems, and five former Treasury secretaries have sounded the alarm on the risks of Musk’s DOGE accessing those systems and potentially stopping congressionally authorized payments. Earlier this week, the Treasury declined to brief a pair of the highest-ranking lawmakers on the Senate Finance Committee, including Wyden, on the ongoing controversy related to DOGE’s use of Treasury payment systems, citing ongoing litigation.

By FATIMA HUSSEIN and JOSH BOAK, Associated Press


Category: E-Commerce

 


2025-02-14 14:20:00| Fast Company

In a move that clears the petty-but-harmful bar set by the shutdown of the White House Spanish-language page and the removal of words such as “diversity” and “fairness” from the FBI’s core values, the National Park Service (NPS) has erased references to transgender and queer people from the official website for the Stonewall National Monument, following an executive order issued by President Donald Trump.

A two-gender nation?

Last month, the president declared that the federal government will only recognize two genders: male and female. Since then, federal agencies have been removing references to trans, queer, and intersex people from their pages, the NPS among them. “LGBTQ+” is now also shortened to “LGB” on the NPS website. According to an archived version of the page posted by CNN, it had previously said “LGBTQ+.” The organization has not yet responded to our request for comment.

Honoring brave pioneers

The Stonewall National Monument Visitor Center opened June 28, 2024, marking the 55th anniversary of the Stonewall Rebellion, where LGBTQ+ activists, including trans people, kicked off the modern fight for equality. A program of Pride Live, it was the first LGBTQIA+ visitor center in the National Park Service.

In response to Trump’s tactic, Pride Live and the visitor center published a statement on their websites denouncing the move: “Our space is inextricably linked with and honors the brave pioneers, especially transgender and gender-nonconforming individuals, who led the Stonewall Rebellion,” the statement read. “Independently owned and operated, and 100% supported by donations, we will continue our mission to ensure that every person has access to learn about and see themselves in history.”

Erik Bottcher, a member of the New York City Council, told CNN: “He’s trying to cleave our community apart and divide us. He’s not going to succeed. Lesbians and gays are not going to abandon our transgender siblings. We are one community.”

Pride Live has made clear it will stand against the erasure of the existence and contributions of trans and queer people from the narrative: “Through the creation of the Stonewall National Monument Visitor Center, we’re unwavering in our effort to protect and preserve Stonewall’s legacy and history,” the organization said.

Government websites have been moving to erase references to transgender and nonbinary communities since Trump retook the White House. Last month, the U.S. State Department updated its page for queer travelers, which now says “LGB Travelers.”



 

2025-02-14 14:13:00| Fast Company

A young DARPA-backed startup with a fresh spin on a low-power computer chip has raised over $100 million in a Series B funding round, a sign of the wild appetite for more energy-efficient ways to build bigger and better AI.

The company, EnCharge AI, aims to move AI’s heaviest workloads from big, power-hungry data centers to devices at the edge, including laptops and mobile devices, where energy, size, and cost constraints are tighter. Its approach, known as analog in-memory computing, comes from research that CEO Naveen Verma spun out of his lab at Princeton University, where he’s still a professor of electrical and computer engineering.

Verma wouldn’t say who its customers are. But in addition to the U.S. Defense Advanced Research Projects Agency (DARPA), which gave it $18.6 million last year, a who’s who of industrial, electronics, and defense players are interested in EnCharge’s chips. The oversubscribed funding round, led by Tiger Global, includes the intelligence community’s investment unit In-Q-Tel, alongside the venture arms of defense giant RTX, power producer Constellation Energy, South Korea’s Samsung, and Taiwan’s Hon Hai (Foxconn). The Santa Clara, California-based startup is also working with semiconductor giant Taiwan Semiconductor (TSMC) to produce its first-generation chips.

The new investment brings EnCharge’s total funding to more than $144 million, and will help the 60-person company commercialize its technology, which isn’t cheap in the world of semiconductors.

“Given the capital intensity of the semiconductor industry, the Series B is an important step for advancing commercialization” of its first chips, Verma tells Fast Company. He declined to disclose the company’s new valuation.

EnCharge’s push comes at a pivotal moment for the AI industry, which is grappling with the fast-growing energy and compute demands driven by a storm of generative AI.
The advent of DeepSeek last month has brought new efficiencies and lower costs to AI model training and inference. (It’s unclear if more widespread use of DeepSeek-like models will cancel out those efficiency gains.) But DeepSeek is unlikely to stem the industry’s demand for more compute, more memory, and more energy. EnCharge says that, for a wide range of AI use cases, its specialized chips, or accelerators, require up to 20 times less energy compared to today’s leading AI chips.

To make it work, the company relies on a high-wire technique. Rather than using only digital transistors to perform some of the multiplication operations at the heart of AI inference (the continuous computations that produce chatbot outputs), EnCharge’s chips exploit the non-binary wonders of the analog world. “Analog computing isn’t new,” says Verma, “but EnCharge’s specific implementation and system-level approach address many of the fundamental issues that caused previous analog computing approaches to fail.”

Finding efficiencies in analog amid the noise

Memory access is computing’s biggest energy hog, and in AI, inference, rather than training, makes up the bulk of most models’ computations. Type a prompt and press enter, and the process of inference begins somewhere in the cloud, which is to say in hulking data centers where giant clusters of hot, energy-intensive GPUs and CPUs demand massive amounts of electricity and water.

Along with the existing environmental costs, the energy required to train and run generative models on these chips is spiking demand on a stretched-thin energy grid. According to the International Energy Agency, a typical request to ChatGPT consumes 10 kilojoules, roughly ten times more than a typical Google search.

Memory’s energy demands also impose limits that could slow machine learning progress. Those include the way that, on a chip, the speed of computation is outpacing the bandwidth of memory and communication.
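For scale, the per-request figure above can be converted into grid-friendly units with some back-of-the-envelope arithmetic. The joule figures are the ones cited above; the billion-requests-per-day volume is a hypothetical chosen only to illustrate the aggregate load:

```python
# Back-of-the-envelope conversion of the energy figures quoted above.
CHATGPT_REQUEST_J = 10_000                 # ~10 kilojoules per ChatGPT request (IEA, as cited)
GOOGLE_SEARCH_J = CHATGPT_REQUEST_J / 10   # "roughly ten times" less per Google search

# Convert joules to watt-hours (1 Wh = 3,600 J) to match utility-scale units.
chatgpt_wh = CHATGPT_REQUEST_J / 3600      # watt-hours per request

# Hypothetical scale: daily consumption in MWh at one billion requests per day.
daily_mwh = chatgpt_wh * 1e9 / 1e6
print(f"{chatgpt_wh:.2f} Wh/request; ~{daily_mwh:,.0f} MWh/day at 1B requests/day")
```

At roughly 2.8 Wh per request, a billion daily requests lands in the thousands of megawatt-hours per day, which is why data-center operators and grid planners are paying attention.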
Researchers call this problem the von Neumann bottleneck, or the memory wall. EnCharge’s approach to the challenge is part of a decades-long quest to find efficiencies by placing memory circuits not next to, but inside, a computing core, a technique called in-memory compute (IMC). Though it can be tricky to pull off, IMC promises speed-ups by bringing memory and logic closer together and making it far less computationally costly to access memory.

This is where the analog computing comes in. Whereas digital devices since the mid-twentieth century operate in a world of on or off, discrete 1s and 0s, analog devices exploit the in-between information of physical phenomena, such as electrical, mechanical, or hydraulic quantities, which allows them to store more data and operate at far lower energy than digital processors. (Quantum computers take the idea to another level, by exploiting the behavior of very, very tiny things.) Because the states in analog devices may be, in the case of EnCharge’s chip, a continuum of charge levels in a tiny resistive wire, the difference between analog states can be smaller than the difference between 1 and 0. That requires less energy to switch between values. And that reduces the “data movement costs” between a chip’s memory and compute, says Verma.

But, like quantum, analog computing is notoriously noisy and difficult to scale. Verma says EnCharge addresses the accuracy and scalability problems using precise geometry control of its metal wire capacitors, static random-access memory (SRAM) to store the model weights, and a digital architecture that includes a software stack and a compiler to map applications to the chip. “The result is a full-stack architecture that is orders-of-magnitude more energy efficient than currently available or soon-to-be-available leading digital AI chip solutions,” Verma says.
“This includes all the digital-to-analog and analog-to-digital conversion overheads, which need to be designed in specialized and integrated ways with the in-memory-computing architecture.” To reduce the costs involved in converting from digital to analog and back, the chip relies on a technique Verma calls “virtualized” IMC. “This involves performing computations directly within a first level of memory, but also by using a memory-system hierarchy, in an analogous way to virtualized memory systems, to enable the computations to efficiently scale to very large AI workloads and models. While traditional architectures face decreasing bandwidth and increasing latency as data size grows,” he wrote, “EnCharge’s virtualized IMC enhances latency and efficiency when accessing and computing on larger amounts of data,” making it efficient for both small and large language models.

Since Verma spun out the research in 2022, the company has been working with customers to refine and derisk its hardware and software designs. The current chips, discrete boards on PCIe cards, can run machine-learning algorithms at over 150 tera operations per second (TOPS) per watt, versus 24 TOPS per watt by an equivalent Nvidia chip performing an equivalent task. A newer process to trace finer chip features has allowed the company to triple its energy efficiency metric, to about 650 TOPS per watt.

“The efficiency breakthrough of EnCharge AI’s analog in-memory architecture can be transformative for defense and aerospace use cases where size, weight, and power constraints limit how AI is deployed today,” said Dan Ateya, president and managing director of RTX Ventures. “Continuing our collaboration with EnCharge AI will help enable AI advancements in environments that were previously inaccessible given the limitations of current processor technology.”

Dozens of companies are developing new kinds of chips and other architecture to grapple with the energy and computing challenges of AI training and inference.
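Taking the TOPS-per-watt figures quoted above at face value, the implied efficiency ratios are easy to check. This is illustrative arithmetic only; any real-world advantage depends on workload, numerical precision, and how the comparison was measured:

```python
# Efficiency ratios implied by the TOPS-per-watt figures cited above.
ENCHARGE_GEN1_TOPS_W = 150   # EnCharge's current PCIe-card chips
NVIDIA_EQUIV_TOPS_W = 24     # equivalent Nvidia chip on an equivalent task, as cited
ENCHARGE_NEW_TOPS_W = 650    # figure claimed for the newer process

gen1_advantage = ENCHARGE_GEN1_TOPS_W / NVIDIA_EQUIV_TOPS_W  # per-watt advantage, gen 1
new_advantage = ENCHARGE_NEW_TOPS_W / NVIDIA_EQUIV_TOPS_W    # per-watt advantage, newer process
print(f"gen-1: {gen1_advantage:.2f}x, newer process: {new_advantage:.1f}x")
```

On these numbers, the first-generation chips come out at roughly a 6x per-watt advantage over the cited Nvidia baseline, and the newer-process figure at roughly 27x.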
Startups like Cerebras Systems, SambaNova Systems, and Graphcore, acquired last year by Japan’s SoftBank, have sought to compete with Nvidia in the AI training market. Cerebras, which sells its giant AI chips and offers services to customers through the cloud, filed paperwork in September to list its shares on the Nasdaq in an initial public offering. In its IPO prospectus, the company said it expects the AI computing market to grow from $131 billion in 2024 to $453 billion in 2027.

Other companies are also exploring in-memory analog computing, including startups Mythic, Neo Semiconductor, and Sagence. In a set of new papers, IBM Research scientists also demonstrated advances in analog in-memory computing, including research on a brain-inspired chip architecture for large models, as well as phase-change memory for smaller edge-sized models, and algorithm advances. Analog in-memory computing “could substantially improve the energy efficiency of LLMs by leveraging mixture of experts (MoEs) models,” according to one of the studies, which is featured on the January cover of the journal Nature Computational Science.

The Defense Department also continues to pursue analog computing. The Defense Innovation Unit (DIU) on Monday released a solicitation for a “digital engineering platform” to accelerate the design and validation of analog chips, as well as mixed-signal, photonic, and hybrid varieties. “[T]he design of these chips is often a bottleneck, with prolonged design cycles and high redesign rates,” said the solicitation. “Current manual design processes are time-consuming, iterative, and error-prone. Furthermore, the rising costs of prototyping and the shortage of skilled analog designers have created bottlenecks in the development pipeline. The DoD needs a way to accelerate the design process and reduce errors.”
Russ Klein, the program director of Siemens EDA’s high-level synthesis division, told Semiconductor Engineering in December that if an analog IMC system like EnCharge’s can effectively scale, it could establish a new energy standard for inference and other high-performance computing. “The energy savings of not having to move all that data and the parallelism that IMC promises will massively impact not just AI, but any computation that takes place on large arrays of data,” he said.
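The accuracy problem the article describes, discrete weight states plus noise corrupting an analog sum, can be sketched numerically. This toy model illustrates the general analog in-memory multiply-accumulate idea, not EnCharge’s actual capacitor-based design; the level count and noise magnitude are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def analog_mac(x, w, levels=256, noise_std=0.01):
    """Toy model of one analog in-memory multiply-accumulate:
    weights are stored as one of `levels` discrete charge states,
    the dot product accumulates as a single analog quantity, and
    additive noise corrupts the sum before it is digitized again."""
    w_quantized = np.round(w * (levels - 1)) / (levels - 1)  # finite charge levels
    analog_sum = x @ w_quantized                             # accumulation "on the wire"
    return analog_sum + rng.normal(0.0, noise_std)           # analog noise

x = rng.random(64)   # activations
w = rng.random(64)   # weights, scaled into [0, 1]
exact = x @ w
approx = analog_mac(x, w)
print(f"exact: {exact:.3f}  analog model: {approx:.3f}  error: {abs(exact - approx):.4f}")
```

Even in this crude sketch the result stays close to the exact dot product, which hints at the engineering trade the article describes: accept a small, bounded analog error in exchange for skipping most of the memory movement.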



 
