The search for extraterrestrial life represents one of humanity’s most profound scientific quests, one that could fundamentally reshape our understanding of our place in the universe. Yet current telescopes face an almost impossible challenge: separating the faint glow of planets from stars that greatly outshine them.
Now, a radical new telescope design solves this problem. Unlike current circular telescopes like the Hubble or the James Webb, this design is a long rectangle, about 66 feet long by 3.3 feet tall. According to a new research study published in Frontiers in Astronomy and Space Sciences, the new design will be able to detect a record number of habitable planets in a record time span, while being easier to implement and less expensive than current and future generations of space telescopes.
The author of the study, astronomy professor Heidi Newberg, and her team at the Rensselaer Polytechnic Institute in Troy, NY, say it’s a bizarre shape that goes against centuries of telescope building. But Newberg believes the design will vastly improve the chances of discovering extraterrestrial life.
Computer simulations detailed in the team’s research show the rectangular telescope could discover approximately 11 habitable exoplanets around the 15 closest sun-like stars in just one year of operation. Expanding to 46 target stars within 108 light-years, the mission could identify 27 potentially habitable worlds in 3.5 years, meeting NASA’s Habitable Worlds Observatory goal at a fraction of the cost and complexity.
[Image: Leaf Swordy/Rensselaer Polytechnic Institute]
Breaking the circle
Newberg’s design abandons the centuries-old assumption that telescopes must be circular. Instead, Newberg’s team proposes a design that delivers the resolution of a massive circular telescope while fitting into existing launch capabilities for spacecraft. To do so, the concept leverages an optical principle: resolution depends on the longest dimension of a telescope’s primary mirror.
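To put rough numbers on that principle, here is the textbook diffraction estimate with this article’s dimensions plugged in (the paper’s own figures may differ): resolution along an axis is roughly the observing wavelength divided by the mirror’s extent along that axis.

$$\theta \approx \frac{\lambda}{D} = \frac{10\ \mu\text{m}}{20\ \text{m}} = 5\times10^{-7}\ \text{rad} \approx 0.1\ \text{arcsec}$$

Here 10 µm is the mid-infrared wavelength mentioned later in this article, and 20 m is the 66-foot long axis. Along the 1-meter (3.3-foot) short axis, the same formula gives a resolution about 20 times coarser, which is why the telescope has to rotate between observations, as described below.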
The technical specifications are deceptively simple and elegant. This is how it works: Imagine you’re trying to spot a firefly sitting next to a giant searchlight from miles away. That’s essentially what astronomers face when hunting for planets around other stars.
Stars are so blindingly bright that any planets orbiting them get completely washed out. To image the planet alone, the telescope uses a device called an Achromatic Interfero Coronagraph, a well-known astronomical instrument that is basically a sophisticated light-blocking system, working like noise-canceling headphones but for starlight instead of sound.
It splits the incoming light into separate beams, then smashes them back together in a way that the star’s light cancels itself out while the planet’s light survives the process. The beauty is that this technique only needs to dim the star by about a million times (which sounds impossible but is actually pretty manageable with current technology), rather than the billion-times dimming required when looking at regular visible light. This makes planet hunting dramatically easier than previous methods.
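Here is a stripped-down way to see that cancellation (a generic two-beam nulling picture, not the AIC’s exact optical layout): split the incoming field E into two equal beams and flip one by half a wave before recombining. For the on-axis star, the two beams arrive identical except for the flip, so they cancel exactly:

$$E_{\text{star}} \propto \frac{E}{\sqrt{2}} + \frac{E\,e^{i\pi}}{\sqrt{2}} = 0$$

A planet a small angle θ off-axis adds an extra path difference between the two beams, so its phase offset is no longer exactly π and its light leaks through:

$$E_{\text{planet}} \propto 1 + e^{\,i\left(\pi + 2\pi\theta b/\lambda\right)} = 1 - e^{\,i\,2\pi\theta b/\lambda} \neq 0 \quad (\theta \neq 0)$$

where b stands in for the effective separation between the two beam paths. The depth that the on-axis null actually reaches is what sets the roughly million-to-one suppression described above.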
The proposed telescope would use a segmented beryllium mirror, similar to Webb’s successful hexagonal design, folded for launch aboard a Falcon Heavy rocket. And here is where the magic comes in: Unlike circular telescopes that provide uniform resolution in all directions, the rectangular design concentrates its resolving power along a single axis.
To find planets at any angle around their stars, the telescope rotates 90 degrees between observations, effectively scanning the sky in two perpendicular orientations. This is done with a sensor that only looks at a specific type of invisible light called infrared (think heat vision) at a wavelength of about 0.0004 inches, which is where planets that could support life naturally glow the brightest from their own heat.
The problem with this approach is that you need to keep the telescope perfectly still in space while it rotates. Think of the astronomer like a sniper who needs to hit a target the size of a pinhead from 20 miles away, except here the target is keeping a space telescope perfectly still while it’s hurtling through space at thousands of miles per hour.
The telescope needs to stay pointed in exactly the right direction with mind-boggling precision: if it drifts off course by even 1.25×10⁻⁹ radians (the width of a human hair viewed from 500 miles away), the whole observation gets ruined.
This is actually four times more precise than the already incredibly steady James Webb Space Telescope, and the reason is simple physics: because this new telescope can see finer details along one direction, it’s like having a more powerful zoom lens that amplifies every tiny wobble, so the whole system has to be that much more rock-solid to compensate. It’s a challenge, Newberg says, but it’s doable.
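For scale, converting that tolerance into the units astronomers usually quote (a straight unit conversion, not a number from the paper):

$$1.25\times10^{-9}\ \text{rad} \times 206{,}265\ \tfrac{\text{arcsec}}{\text{rad}} \approx 2.6\times10^{-4}\ \text{arcsec} \approx 0.26\ \text{milliarcseconds}$$

That squares with the “four times” comparison above, given that Webb’s in-flight line-of-sight pointing stability has been reported at roughly a milliarcsecond.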
The design faces other technical uncertainties around structural stability, thermal control, and vibration management across its 66-foot span. However, these challenges are comparable to those NASA solved for Webb’s segmented mirror system. The space agency has extensive capabilities for “high-fidelity modeling and environmental test (cryovac and vibration)” that would apply directly to the rectangular design.
Other engineering challenges, while significant, appear solvable with existing technology too. “I have asked scientists who are more expert in space telescope vibration, flexion, and thermal stability and have gotten responses ranging from ‘might be a problem’ to ‘not a problem.’ No one has seen an obvious reason that this would not work,” Newberg tells me.
The race to find life and Earth 2.0
Multiple teams worldwide are pursuing other approaches to overcome the challenge of hunting planets with alien life, each representing billions of dollars in development costs and decades of technological advancement. All of them, however, stick with traditional round mirrors.
The LUVOIR (Large UV/Optical/IR Surveyor), for example, proposes two concepts resembling James Webb, with segmented hexagonal mirrors assembled in a circle, one 26 feet in diameter, the other 49 feet. Each would be equipped with an ultra-high-contrast coronagraph capable of blocking starlight by ten billion to one. This visible-light approach demands unprecedented precision in optical engineering.
HabEx (Habitable Exoplanet Observatory) takes a different path: a 13-foot telescope paired with a massive 171-foot starshade that flies 47,600 miles away to physically block stellar light. Moving this enormous shadow between target stars would require immense fuel expenditure.
Another radical approach is the European LIFE mission, which calls for a swarm of small telescopes flying in perfect formation, always coordinated with each other. But if the position accuracy of Newberg’s design is a challenge, LIFE’s requirements are nuts. The telescopes would need to maintain positioning accuracy “precisely calibrated to the size of a typical molecule,” as Newberg describes it to me. It’s a requirement, she says, that remains “currently infeasible.”
Newberg claims that her team’s design avoids all the pitfalls of its rivals. “I would argue that my concept is the ‘conservative’ one for identifying nearby, habitable exoplanets,” she tells me. “Neither the LUVOIR or HabEx proposals were selected in the National Academies Committee for a Decadal Survey on Astronomy and Astrophysics 2020 because they knew that the technology was not mature enough to develop a reasonable time and cost estimate for these missions.”
Engineering trade-offs
That’s not to say that the design doesn’t have limitations. Its infrared sensor sees only a fraction of the spectrum: the slice that serves to locate signs of life, but not enough to image the planet more fully or gather much additional information about it.
“This means that we would need to tune the coronagraph to different wavelengths at different times and take individual exposures to observe different molecules,” Newberg explains.
She also told me that the new design would be ideal for quickly identifying alien-life candidates for more detailed observations in the future: “These more complex observatories require more time to develop the technology, and would benefit from a curated list of very interesting targets to observe,” she says.
One limitation of its rectangular design is that it produces elongated, “cigar-shaped” point sources rather than round images. “If the Hubble Space Telescope was rectangular, all of those beautiful images would look smeared out in one direction, so that each of the stars would be cigar-shaped,” Newberg acknowledges. For exoplanet hunting, this limitation proves irrelevant: separating two point sources matters more than image aesthetics.
So yes, the planets can be seen as separated from their stars, just not the way you’d see them with your naked eye (although the images will likely be processed into photos that look normal for public use). But as long as the evidence of life is visible, that’s all that matters.
Cost and implementation
This telescope redesign benefits both science and taxpayers. Newberg says the cost advantages of this design are substantial compared to any other. While still requiring approximately $1 billion (making it a major space mission), the rectangular telescope would cost significantly less than alternatives demanding new technologies or multiple spacecraft.
The simpler design reduces both development risks and operational complexity, potentially accelerating the timeline to first results by decades compared to more ambitious concepts.
The rectangular concept could revolutionize high-resolution space astronomy beyond this single mission. The same principle could work at different wavelengths: a 66-foot rectangular mirror observing in visible light could theoretically detect Earth-like planets out to 650 light-years, though with far greater technical challenges.
However, the reach of roughly 100 light-years is exactly what’s needed for humanity’s next dream. Within around 100 light-years of Earth lie the only stars we could realistically explore with robotic probes on human timescales. “The closer the exoplanet is, the more likely we could send a probe to investigate, establish communication with its residents, or possibly one day visit,” she says.
Newberg says the telescope could enable a probe that could beam back images of the planet’s surface. “The rectangular telescope could provide a straightforward path towards identifying our sister planet: Earth 2.0,” Newberg says.
Helen Toner never expected a board vote to make her a household name in tech policy. But when OpenAI’s leadership crisis spilled into public view in 2023, her role as a director, and as one of the AI safety community’s prominent voices, put her at the heart of Silicon Valley’s most consequential fight over the future of artificial intelligence.
That experience vaulted her into a rare position: someone trusted in both Washington, D.C., and Silicon Valley to speak plainly about the risks of AI. Now, as the new director of Georgetown University’s Center for Security and Emerging Technology (CSET), the D.C. think tank she cofounded earlier in her decade-long policy career, she’s channeling that hard-earned credibility into shaping how the U.S. confronts the technology’s national security stakes.
Fast Company spoke with Toner about U.S.-China competition in AI, the growing influence of industry lobbying, and the challenges of safeguarding AI in a rapidly evolving landscape. The conversation has been edited for length and clarity.
Why did you take on this new job at Georgetown? What work can be done via a D.C. tech think tank to influence the future of AI?
CSET is a 50-person organization within Georgetown and we focus on policy research. We’re not academics; we do analysis and write papers and brief policymakers, and it’s all focused on trying to help policymakers and other decision-makers understand the implications of emerging tech for their work, in particular on the national security side. In this new role, I will be leading the organization as a whole. That means helping the whole 50-person team work together and succeed: our analysis team, which is our main researchers; our data team, which is the core of CSET’s evidence-driven, data-driven model; our operations team; and our external affairs team.
Is all this coming through the lens of national security?
Yes, that’s our driving focus: implications of emerging tech for national security. Of course, there are different interpretations of that. Some of our work is very squarely in that bucket, in particular work on military applications of AI or geopolitical dynamics of U.S.-China competition, which is a good chunk of our work, or AI and cybersecurity, AI and biosecurity. Different people set the boundaries of national security in different places, so we have a big effort on talent and workforce, for example, where it’s very easy to draw the national security implications of that, but it’s a little bit less DoD or intelligence community standard fare than some of our other work.
Is this mainly government-facing or is your organization going to have an influence on the wider AI industry, or on the way that the government works with the wider industry?
We definitely think of policymakers as our core audience. That includes the federal level, but also state legislators who are increasingly looking at AI as something they want to be active on. We also see a number of other audiences beyond just policymakers: decision-makers in industry, for sure; the broader media, to some extent; and the broader public.
We’re always trying to estimate whether the U.S. still has a lead in AI models, and if we are leading in robotics, automation, etc. What’s your take on the state of play? Is it possible that this new AI race is being overblown?
This is something CSET has done a lot of work on, and we’re known for [offering] really grounded analysis of what is going on in China, [which] clearly wants to be competing with the U.S. I think different people, different industry leaders, different policymakers mean different things by that. So I think on both sides some people mean a genuine race to AGI or race to superintelligence. I think other people mean competition in the same sense that we talk about competitive markets: trying to win users, trying to win revenue.
The military side is one area where there’s just very clear zero-sum competition where the U.S. wants our military to be stronger and more effective than the Chinese and the Chinese want the reverse. So I think it’s very important to be thinking about how to effectively adopt AI in the military. But that’s a different question than who has the newest, shiniest model release. So to try and sum that up, I think there is real competition. What exactly that means depends on whom you ask. Depending on which type of competition you zoom in on, you want to be looking at different indicators and considering different types of success. The answer you get of who is “ahead” comes out differently depending on which area you choose to focus on.
Is the competition happening on the level of wanting the world’s AI to run on U.S.-made AI models in the same way that we want the world’s business to run on the dollar?
I think China sees itself as being a great civilization that, due to various reasons, missed out on what they would call the first three industrial revolutions, and was really trailing behind in a way that wasn’t in keeping with their conception of themselves as a great power in the world. So they see AI broadly as an opportunity for them to reverse that trend and to instead be a global leader. Within China-watching circles, there are big debates about what exactly that means. What exactly does China leading in the world look like in general? Is that something that involves expansionism? Is that something that purely involves taking Taiwan back and then being satisfied with their sphere of influence there?
How would you describe the way the Chinese government involves itself in the Chinese AI industry, especially defense applications? By making grants to Chinese AI companies, can the government steer the focus of the research?
Typically, the way the Chinese government will work is less that they will directly meddle and directly go in and say, “Hey, you have to do this, you have to do that.” Typically, their preference is to set broad guidance or provide some priorities or some overarching areas of emphasis and then they’ll let companies, provincial governments, local governments kind of figure out their own way of hitting that.
So what we tend to see is less that they invest through this fund and then they go in and tell the CEO what to do and more that they will have central pronouncements or they’ll have party members on boards or they’ll have party cells inside the companies that are more gently steering along the way and also making sure that there’s a channel for information between the Party and the companies, so that when things come up there’s an ability to exert influence.
What did you make of the Trump administration’s AI Action Plan?
There’s a lot to like in the substance of the AI Action Plan. The rhetoric is very different from under the Biden administration, but there’s continuity on many of the underlying points, like competing with China, facilitating infrastructure buildout, and building sensible guardrails to unlock innovation. I hope the relevant agencies have the resources and the AI expertise to implement the plan thoughtfully.
Regarding the way U.S. AI companies are working in Washington, D.C., my impression is that they’re adding staff and perhaps spending more money on lobbying. Do you perceive an overall strategy by those companies to, for example, make sure that no meaningful safety regulation starts to bubble up in Congress?
I think we’ve definitely seen a real ramp-up in the size and the sophistication of AI companies’ efforts in D.C. Some of them have had very sophisticated efforts for a long time. You know, Microsoft: this is not their first rodeo. But I think certainly as the companies are growing, as interest in D.C. is growing in their work, they’re staffing up to deal with that.
A big motivator for CSET in the work that we do is wanting to be able to bring a perspective that is really technically informed and technically accurate to these topics. Congressional staffers or other folks in government often get [this] from the [AI] companies. The companies will tell them here’s how the technology works, how the industry works, what’s realistic or not realistic. It’s important for policymakers to have that information, but you ideally want them to be getting it from a party that is operating in the interest of the public rather than the interest of the company. Our mission is to advance the public interest, not to advance our bottom line.
Do you believe U.S. AI companies are spending enough on safety research relative to their spend on regular model and application R&D? Is there even a way to measure that?
In general I don’t know that there is a clean distinction between regular R&D spending and safety R&D spending. Often there are connections between those two areas. For example, if a model tends to fail on a certain kind of question, from one lens you could say that that’s a safety problem, from another lens you could just say that that’s a usability or a capability problem.
I think the most relevant questions are more about when there are decisions that would be overall beneficial for the world but would be maybe not in their short-term business interest, what structures and processes do they have in place to make those decisions, and then do they actually follow through? Something you’ve seen, for example, is making commitments to do certain amounts of testing and then after the fact seeing off-the-record reporting that the testing they said they would do was rushed or was not completed because they were trying to launch before a competitor or something like that.
Is it your impression that the off-the-record reporting was true and that this might still be going on?
I don’t have any independent information. I just have what’s reported.
A lot of people on the West Coast are talking about whether or not there’s an AI bubble. Do you have any thoughts about that? Are AI companies focusing more on applying their models and generating revenue, and focusing less on loftier goals like AGI and superintelligence?
There’s definitely been chatter about whether we’re in a bubble here as well. The perspective that makes most sense to me is that two things can both be true. Some of the generative AI-focused, high-valuation VC investments in early-stage companies promising to build revolutionary products within a couple of years: that can be a bubble. There can be overinflated expectations there. And it can also be true that the underlying technological improvements in AI are continuing, and that the companies that are really investing in those underlying trends (the OpenAIs, the Anthropics, etc. of the world) are on to something and that they’re likely to continue succeeding and likely to see their revenues continue to rise.
Another way to say a similar thing is to point to the dot-com bust in the early 2000s, where there were investors who had gotten out over their skis and lost a lot of money. But the underlying trends were real and the underlying impacts on society were significant and continued after that bubble burst.
Many people were disappointed in OpenAI’s GPT-5, feeling like the pace of advancement toward artificial general intelligence (AGI) and superintelligence is slowing, if not stalled. What’s your take?
Two things are true about GPT-5. First, it’s evidence that we’re not on track for the very fastest scenarios toward AGI or superintelligencefor example, AGI by 2027. But second, it still fits on a trend line of steady continued advancements over the last five years. So I disagree with the sentiment that GPT-5 shows that progress is slowing down.
It seems like running an AI company, whether it’s developing models or applications, is just a really expensive business. Do you think the industry needs to find some fundamental research breakthrough to make the cost of doing this business more viable?
No, I’m not sure that they do. I think it’s actually really common for new technologies, especially technologies that are very flexible and general purpose, to take years or even decades for industry and society to figure out [how] to get the most value out of that technology. If you look back at a wide range of general-purpose technologies (electricity or the computer or different communications revolutions), that’s been the pattern. I don’t know if we’ll see the investments keep increasing at the same rate that they have been, going up by 10X every however many years. But I do think that we’re going to keep seeing the returns on those investments keep going up as people figure out how to make use of the advances that we’ve already seen.
Want more housing market stories from Lance Lambert’s ResiClub in your inbox? Subscribe to the ResiClub newsletter.
When assessing home price momentum, ResiClub believes it’s important to monitor active listings and months of supply. If active listings start to rapidly increase as homes remain on the market for longer periods, it may indicate pricing softness or weakness. Conversely, a rapid decline in active listings could suggest a market that is heating up.
Since the national Pandemic Housing Boom fizzled out in 2022, the national power dynamic has slowly been shifting directionally from sellers to buyers. Of course, across the country that shift has varied significantly.
Generally speaking, local housing markets where active inventory has jumped above pre-pandemic 2019 levels have experienced softer home price growth (or outright price declines) over the past 36 months. Conversely, local housing markets where active inventory remains far below pre-pandemic 2019 levels have, generally speaking, experienced more resilient home price growth over the past 36 months.
Where is national active inventory headed?
National active listings are on the rise (+21% from August 2024 to August 2025). This indicates that homebuyers have gained some leverage in many parts of the country over the past year. Some sellers’ markets have turned into balanced markets, and more balanced markets have turned into buyers’ markets.
Nationally, we’re still below pre-pandemic 2019 inventory levels (-11% from August 2019), and some resale markets, in particular chunks of the Midwest and Northeast, still remain tight-ish.
While national active inventory is still up year over year, the pace of growth has slowed in recent months (more than typical seasonality would suggest) as some sellers have thrown in the towel and delisted (more on that in another piece).
August inventory/active listings* total, according to Realtor.com:
August 2017 -> 1,325,358
August 2018 -> 1,285,666
August 2019 -> 1,235,257
August 2020 -> 779,558
August 2021 -> 574,638 (overheating during the Pandemic Housing Boom)
August 2022 -> 726,779
August 2023 -> 669,750
August 2024 -> 909,344
August 2025 -> 1,098,681
If we maintain the current year-over-year pace of inventory growth (+189,337 homes for sale), we’d have 1,288,018 active inventory come August 2026.
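As a quick sanity check, here is the arithmetic behind that projection, using only the Realtor.com figures listed above (a simple linear extrapolation, which is all the projection claims to be):

```python
# Project August 2026 active listings by repeating the latest
# year-over-year gain.
aug_2024 = 909_344
aug_2025 = 1_098_681

yoy_gain = aug_2025 - aug_2024            # +189,337 homes for sale
aug_2026_projection = aug_2025 + yoy_gain

print(f"YoY gain: {yoy_gain:,}")                        # 189,337
print(f"Aug 2026 projection: {aug_2026_projection:,}")  # 1,288,018
```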
Below is the year-over-year percentage change by state:
[Chart: year-over-year percentage change in active inventory by state]
While active housing inventory is rising in most markets on a year-over-year basis, some markets still remain tight-ish (although it’s loosening in those places too).
As ResiClub has been documenting, both active resale and new homes for sale remain the most limited across huge swaths of the Midwest and Northeast. That’s where home sellers this spring had, relatively speaking, more power.
In contrast, active housing inventory for sale has neared or surpassed pre-pandemic 2019 levels in many parts of the Sunbelt and Mountain West, including metro-area housing markets such as Punta Gorda, Florida, and Austin.
Many of these areas saw major price surges during the Pandemic Housing Boom, with home prices getting stretched compared to local incomes. As pandemic-driven domestic migration slowed and mortgage rates rose, markets like Tampa, Florida, and Austin faced challenges, relying on local income levels to support frothy home prices.
This softening trend was accelerated further by an abundance of new home supply in the Sunbelt. Builders are often willing to lower prices or offer affordability incentives (if they have the margins to do so) to maintain sales in a shifted market, which also has a cooling effect on the resale market: Some buyers, who would have previously considered existing homes, are now opting for new homes with more favorable deals. That puts additional upward pressure on resale inventory.
In recent months, that softening has accelerated again in West Coast markets tooincluding much of California.
At the end of August 2025, 14 states were above pre-pandemic 2019 active inventory levels: Alabama, Arizona, Colorado, Florida, Hawaii, Idaho, Nebraska, Nevada, Oklahoma, Oregon, Tennessee, Texas, Utah, and Washington. (The District of Columbia, which we left out of this analysis, is also back above pre-pandemic 2019 active inventory levels. Softness in D.C. proper predates the current administration’s job cuts.)
Big picture: Over the past few years we’ve observed a softening across many housing markets as strained affordability tempers the fervor of a market that was unsustainably hot during the Pandemic Housing Boom. While home prices are falling somewhat in pockets of the Sunbelt, a big chunk of Northeast and Midwest markets still eked out a little price appreciation this spring. Nationally aggregated home prices have been pretty close to flat in 2025.
Below is another version of the table above, but this one includes every month since January 2017:
[Chart: the same data shown for every month since January 2017]
Early on in my career, I was focused on being efficient. I wanted to be productive. I wanted to make an impact. And I thought I had mastered the email game in corporate America. Respond quickly; copy in your boss and others so they know what you’re doing; hold onto emails for documentation and forward them back when people get confused.
“You send too many emails,” my boss said, exasperated, in one of our performance reviews. “From the feedback from your peers, you email a lot. And it’s overwhelming the teams.”
“Aren’t we supposed to be emailing each other?” I asked, confused.
“You’re supposed to be communicating. Not everything needs to be an email.”
My boss was right. Somewhere along the way, I embraced email, became obsessed with email, and treated email like it was my job to email, rather than realizing that email was simply a tool to help me do my job better.
Years later, I’m now sure my coworkers used to dread seeing my name in their inboxes. Over time, they likely just glossed over my name, filing it away in a folder they would never open again.
So if you suspect your coworkers might be consistently eye-rolling when your email hits their inboxes, here are three ways to course correct this behavior.
1. Skip that email; make time for a conversation
Early on in my career, I was anxious about inconveniencing colleagues in person. I didn’t want to take up or waste their time. I defaulted to email as my primary form of communication, but I didn’t realize that by sending so many emails, I was inconveniencing them (and damaging my reputation as a manager in the workplace).
I encourage all of us to pause and ask, “Do I really need to send this email?” I’ve been guilty of wanting to empty my inbox, to just get that response or task or request into someone else’s inbox as quickly as possible. If you feel similarly tempted, ask yourself if you can:
Wait to update peers at our weekly team meeting?
Stop by their desk in the morning for our question?
Ask for advice on the project at our Friday lunch?
Text or Slack them and see if they can chat for five minutes?
Research or answer the question yourself before hitting send?
By skipping that email, you are strengthening the way you communicate with your peers. When you can touch base in person, or over video or audio, also make sure you are efficient and brief.
2. Fight the urge to add to the email chain
Recently, I opened my inbox to find more than 50 responses to a reply-all chain that had spiraled out of control. I scrolled through the “congrats” and “great news” and “well deserved” and “amazing work” and on and on, waiting for a breakthrough response or something I might need to know. I deleted it after the 20th message. I didn’t need to read the rest of the responses.
It can be easy to reply all and pile onto the email chain gone wild. So step away from the keyboard. Instead, ask:
Why does everyone need to see our response?
What value does our response add to the conversation?
Who are we trying to prove our value to?
What if we just responded directly to that person rather than filling up everyones inbox?
Can we convey our message in person or another format?
Remember, every email we send is adding to other people’s inboxes, and in turn, we can expect emails back. So if you want to manage the flow of email, send fewer emails.
3. Just wait to hit send
Many organizations still rely on email as a primary form of communication. When you do need to send one, make sure it’s concise and appropriate.
I’ve been guilty of emailing at midnight. I wanted to get through a project, working fast and furiously and firing off emails to get what I needed done. I never stopped to think about how it would make my coworkers feel to see a barrage of messages from me if they happened to be up that late at night. What I was doing wasn’t urgent or worthy of midnight emails: I was just selling lots and lots of consumer products. By consistently acting like everything was urgent, when I really did need my coworkers’ help, it was even harder to get them to respond.
Understanding how we can better work with our coworkers starts with how we communicate. Remember to skip that email when you can and have a quick conversation. Don’t add a reply-all to the email madness. And if you must send that note, just wait to send it. Unless it’s a real-life emergency, that late-night message can wait.
If you have an aging furnace, you might have considered replacing it with a heat pump: the ultra-efficient technology helps shrink utility bills and can have as much climate benefit as switching from a gas car to an EV. But heat pumps are also typically expensive: whole-home systems can sometimes cost $25,000 to $30,000.
Jetson, a startup based in Vancouver, says that it can cut those costs in half to $15,000, or roughly the cost of a premium gas furnace. In some areas, after adding in local incentives, the cost can be as low as $5,000.
“With other heat pumps, you’ve got this huge green premium out there,” says Stephen Lake, Jetson’s founder and CEO. “That’s one of the core reasons we started Jetson: to try and make this something that would become an easy yes for the average homeowner.” The company launched its own heat pump, the Jetson Air, in September 2025.
[Photo: Kevin Arnold/Jetson]
Lake, who previously started a smart glasses startup that was acquired by Google, decided to work on heat pumps after looking at the biggest opportunities for decarbonization. “If you just look at the numbers across the U.S., about 15% of end energy use goes to residential heating and cooling,” he says. “It’s one of the biggest single buckets of carbon emissions out there. In many cases, your home is emitting at least as much, if not more, CO2 than the car in the driveway.”
The technology isn’t newheat pumps have been in use for decades. (Improvements that made the tech work well in very cold temperatures are newer, rolling out over the last 15 years.) But most homes still rely on fossil-fueled heating, and the upfront cost is the main barrier.
To bring down cost, the company started by eliminating markups as much as possible. Most heat pumps are made by a manufacturer, relabeled by a brand, sold to a distributor, and then sold through a contractor to a homeowner, with markups at each point. Jetson works with a manufacturer to make its own heat pumps.
[Photo: Jetson]
Then the team does its own installation. “We really optimized the install process to be a very efficient process, cutting out any wasted labor,” Lake says. “So we’re not like a typical contractor doing something different every day. We’re installing cold climate central heat pumps, basically the same system, every single day over and over.”
The company uses software to virtually plan each project, rather than having to send out a crew to take measurements in person. Then, the startup sends out HVAC technicians, an electrician, and all of the parts needed for the whole installation to happen in a single day.
The system is designed not only to reduce costs but also to minimize friction for homeowners. Typically, getting a heat pump is a multistep construction projecta homeowner would have to find HVAC contractors, schedule time for them to come give quotes, and spend time choosing between appliances. “You’re trying to navigate this complex web of rebates and incentives and then a very technical sales process around which model you want,” Lake says. Often, homeowners also need to separately hire an electrician. Jetson’s site can give a quote, and information about available rebates, within a few minutes.
The company’s heat pump uses software to continually update itself and to improve performance. To help consumers save more on bills, for example, it can time itself to run when electricity prices are lowest.
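Jetson hasn’t published how that scheduling logic works, but the core idea of price-aware load shifting is simple. As a minimal illustrative sketch (all names and numbers below are hypothetical, not Jetson’s actual algorithm), a controller can rank the day’s hourly electricity prices and run during the cheapest hours:

```python
# Hypothetical sketch of price-aware heat-pump scheduling: run during
# the cheapest hours of the day, given a minimum number of run hours
# needed to hold the indoor temperature setpoint.

def plan_run_hours(hourly_prices: list[float], hours_needed: int) -> list[int]:
    """Return the indices of the cheapest `hours_needed` hours, in time order."""
    cheapest = sorted(range(len(hourly_prices)), key=lambda h: hourly_prices[h])
    return sorted(cheapest[:hours_needed])

# Illustrative $/kWh prices for hours 0-23 (overnight power is cheapest).
prices = [0.18, 0.17, 0.16, 0.15, 0.15, 0.16, 0.20, 0.26,
          0.30, 0.28, 0.25, 0.24, 0.23, 0.24, 0.27, 0.31,
          0.38, 0.42, 0.40, 0.35, 0.30, 0.26, 0.22, 0.20]

print(plan_run_hours(prices, 8))  # -> [0, 1, 2, 3, 4, 5, 6, 23]
```

A real controller would also have to fold in the home’s thermal behavior and comfort limits; this sketch only shows the price-ranking core.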
Right now, the startup only works in a few locations: British Columbia, Colorado, and Massachusetts, with New York launching shortly. Those locations all have the right conditions, Lake says, including consumer awareness of heat pumps, relatively high utility bills for oil or gas heat, and good incentives. In Massachusetts, for example, consumers can save thousands on a new heat pump through rebates.
Until the end of 2025, Americans can also use the federal tax credit of up to $2,000. But even without that incentive, the product can make financial sense. Lake says that demand has been strong; after launching the startup last October, it’s on track to install around 1,000 systems by the end of the year.
Airline loyalty programs are a multibillion-dollar business, rewarding frequent fliers and credit card users with free travel, but they can be hard to navigate.
Now, a new global ranking from Point.me of the world’s best airline rewards programs shows which ones provide the most value.
On Wednesday, the travel rewards search platform released its second annual ranking of 59 global airline loyalty programs based on award earnings, redemption, and overall value, with the aim of helping travelers make smarter decisions with their rewards.
For the second year in a row, Air France-KLM’s Flying Blue ranked first globally; followed by American Airlines’ AAdvantage, which jumped from No. 6 to No. 2; and Alaska Airlines’ Mileage Plan (renamed Atmos Rewards in August), which rose from No. 7 to No. 3.
Flying Blue’s No. 1 ranking comes as the loyalty program expands in the U.S., with a focus on its partnerships via SkyTeam (which includes Delta Air Lines). Pros include an easy-to-use app and website and excellent customer service, while drawbacks include high fees for changes and cancellations.
“Getting real value from airline loyalty points is often significantly harder for passengers than it needs to be,” Adam Morvitz, CEO of Point.me, said in a statement. “These rankings draw on our team’s collective travel expertise and deep data-led insight from our reward search engine. This data-driven approach empowers travelers to make strategic decisions about where to invest their loyalty, ensuring they’re not just collecting points, but actually maximizing their travel wealth.”
To get its results, Point.me looked at data from 22 million searches and more than 500 million search results on its platform. It went about finding the best programs using eight criteria: how easy it was to earn points (even without being a frequent traveler), how easy it was to redeem them, the rate of those redemptions, the availability of awards, overall customer service, and policies around holding, changing, or canceling flights. The results provide a road map for consumers navigating the often tricky world of air miles.
Here are the top 10 airline rewards programs in each regional category.
Top 10 airline rewards programs globally
Flying Blue
American Airlines AAdvantage
Alaska Airlines Mileage Plan
Virgin Atlantic Flying Club
United MileagePlus
The British Airways Club
Air Canada Aeroplan
JetBlue TrueBlue
Emirates Skywards
Qatar Airways Privilege Club
Top 10 airline rewards programs in North America
American Airlines AAdvantage
Alaska Airlines Mileage Plan
United MileagePlus
Air Canada Aeroplan
JetBlue TrueBlue
Southwest Rapid Rewards
Delta SkyMiles
Frontier Miles
Allegiant Allways Rewards
Spirit Airlines Free Spirit
Top loyalty programs by region
Point.me also ranked the best rewards program by region. Here are those rankings:
North America: American Airlines AAdvantage
Europe: Flying Blue
Latin America: Avianca Lifemiles
Middle East and Africa: Emirates Skywards
Asia/Oceania: Cathay Pacific Asia Miles/Singapore Airlines KrisFlyer (tie)
Point.me has earned a spot on Fast Company’s World’s Most Innovative Companies list for the past two consecutive years, in 2024 and 2025.
View the full results and complete methodology of the World’s Best Airline Rewards Programs report here.
You are what you eat, as the saying goes. But does the same apply to what you drink?
Pinterest’s Summer Trend report flagged “skincare drinks” as a rising category, with searches up 176% on the app. Since then, the trend has spread across platforms, with TikTok creators touting skin-boosting drink recipes that they claim clear the complexion, racking up thousands of views.
“Maybe you should drink your skincare instead of using all these products to fix your skin,” TikTok creator @xarabeq suggested in a video posted back in June. Her retinol skincare recipe includes carrots, lemon, orange, ginger, and turmeric to make a week’s worth of wellness shots.
[TikTok embed: @xarabeq’s retinol skincare wellness shots. Ingredients: 2 lemons, 1 orange, 2 ginger, ½ bag of carrots, 5 turmerics]
A viral recipe by @nelakugc uses cucumber, celery, lemon, ginger, apple, and greens to concoct a glowy skin juice. Another user recommends a daily shot of olive oil mixed with lemon juice for skincare benefits.
[TikTok embed: @nelakugc’s glowy skin juice]
Worldwide Google searches for “drinks for skin” and “drink for glowing skin” have doubled in the past month, according to Vitabiotics, the U.K.’s top vitamin company. While it’s true that diet affects the health of your skin, the body’s largest organ, are these skincare drinks actually effective?
Carrots appear frequently in recipes because of their vitamin A content. Nutritionist Lucia Stansbie explains the difference between retinol in skincare and vitamin A from carrots.
“This drink is said to be rich in vitamin A, but plant-based vitamin A comes in the form of beta-carotene, the pigment that gives many orange fruits and vegetables their color,” she says. “While vitamin A does help the maintenance of normal skin, our bodies only convert beta-carotene into active vitamin A in small amounts.”
Instead of a daily shot, she suggests simply eating a carrot or adding one to your morning smoothie to maintain vitamin A levels.
Turmeric is another common ingredient in skin elixirs touted on social media. “Turmeric is also an important nutrient, but it’s better absorbed with a source of fat,” Stansbie says. “Instead of using it in a juice, I would again use it in a smoothie where I would add an avocado or nut butter to have some healthy fats to maximize its absorption.”
Celery juice is popular for its hydrating properties, but instead of juicing it, Stansbie suggests blending it and adding a source of vitamin C, “one of the most powerful nutrients which contributes to normal collagen formation for the normal function of skin,” along with spinach and berries. “I would pair this with a protein-rich breakfast that provides vitamin B, such as vitamin B2 and biotin, which both contribute to the maintenance of normal skin.”
Colin Fisher is an organizational scientist and associate professor of organizations and innovation at University College London’s School of Management. He has written about group dynamics for both popular science and management audiences, and his work has been profiled in Forbes, The Times, NPR, and the BBC.
What’s the big idea?
Why do some groups just click, while others fall apart? The Collective Edge unlocks the secrets to building a powerful group or contributing to its success as a member. With the right internal dynamics and structural foundation, a group can be poised and ready to collaborate effectively and become more than the sum of its parts.
Below, Colin shares five key insights from his new book, The Collective Edge: Unlocking the Secret Power of Groups. Listen to the audio version, read by Colin himself, below or in the Next Big Idea App.
1. The lone genius is a myth: try a collective perspective.
We love stories of lone geniuses. Narratives of individuals shaping the world hold a special appeal, whether they are scientists, CEOs, performers, or prime ministers. But lone geniuses are more myth than reality. The truth is that groups make the world go round.
For instance, who invented the lightbulb? If you said Thomas Edison, you’d be wrong. Incandescent bulbs were invented before Edison was born. Edison built on the work of many others, and he didn’t work alone. His breakthrough was a team effort with a group he called the Muckers, whose names are mostly lost to history.
Today, teams dominate the landscape in terms of breakthrough ideas. One study of millions of patents and research papers found that teams were over six times more likely than individuals to produce breakthrough discoveries.
So why do we keep telling the wrong story? Our brains are biased. Psychologists call it the fundamental attribution error: We over-explain success with personal traits and ignore the context that made it possible. We also inflate our own contributions. In one study, group members estimated their share of the team’s output; the totals reached 235%.
This myth makes us worse at building teams: idolizing individual brilliance, hiring stars, purging bad apples, and hoping for lightning to strike. If we view the world from a collective perspective, we can ask better questions: What group conditions made this success possible, and how can we recreate them? Next time you admire a breakthrough, look past the most obvious hero and ask: Who else was involved, and what made their collaboration work?
2. Synergy is real, but elusive.
Synergy sounds like a buzzword, but it’s very real. I’ve felt it as a jazz musician, in groups I’ve been a part of that spontaneously bring ideas out of one another that none of us could have conceived alone. Research shows that synergy is possible. Great musical ensembles, sports teams, and businesses bring together diverse knowledge, skills, and perspectives to become more than the sum of their parts.
But synergy is rare. Most groups underperform because of predictable process losses: coordination breakdowns and low effort. One classic study showed that members of a two-person group contribute only about 70% of what they would working alone. It gets worse as groups grow: a group of six yields only about 40% of its members’ output.
Still, when synergy happens, it’s the pinnacle of group and human performance. Miles Davis’s album Kind of Blue is my favorite example. Each musician had a distinct voice: Coltrane’s sheets of sound, Evans’s lush harmonies, Davis’s restraint. Together, they made each other better. Each musician’s idiosyncratic approach accentuated the beauty of the others, making the whole more than the sum of the individuals.
Anthropologist Margaret Mead was right: “Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it’s the only thing that ever has.” But it isn’t easy, because our group tendencies sometimes bring out the worst in us.
3. Groups can bring out the worst in us.
Groups are often in the news for the wrong reasons: conformity, polarization, prejudice, conflict, and general mass stupidity. Politicians prey on the dark side of our primitive tendencies to praise “us” and blame “them.” Social media can strengthen intergroup hate while isolating us from our local communities. Our tendency to form groups underlies political conflict, war, and atrocities.
Our Paleolithic brains have some dangerous tendencies. One of the most pernicious is conformity. Conformity pressures are powerful and automatic. If you’re in a group watching a street performer, you clap because everyone else claps, even when you’re privately unimpressed. In meetings, that same instinct silences dissenting voices.
Conformity may sound bad, but it has a purpose. Conformity keeps groups together and allows them to coordinate smoothly; we sometimes need to go along to get along. But there is a dark side: sometimes we’re pushed to conform to the will of the collective.
Conformity pressures are at the root of many catastrophic decisions, cult-like thinking, and extremism. Online echo chambers and increasing political polarization are making these forces stronger than ever. But the dark side isn’t inevitable if we structure our groups carefully.
4. Use group structure to stack the deck toward synergy.
Great groups don’t emerge purely by chance. They’re purposely designed to maximize their chances of achieving synergy. A group’s composition, goals, tasks, and norms collectively make up its structure. Structure is the most powerful way to build effective, happier groups. The best-designed groups are small teams working interdependently toward clear goals, with motivating tasks and norms that foster psychological safety and autonomy.
Too often, however, leaders are careless about group structure. They form teams based on politics and availability, rather than selecting the optimal mix of knowledge and skills. They charge teams with vague goals yet micromanage the process. They offer the team boring, demotivating tasks. When problems arise, many try to directly alter the group process, holding meetings to defuse conflict or giving rousing speeches to motivate disengaged members. One study found that when faced with a struggling group, 84% tried to intervene in the process, while only 5% used the most powerful lever available: changing the group’s structure.
It’s like they say in gambling: The house always wins. In a casino, you can win with a good strategy for a little while. But, in the long run, the odds embedded in the game will win out. It’s the same in teams. Structure is simply more powerful than coaching. It should be the first place you turn when designing a group for synergy.
5. You can shape the groups in your life, even without a title.
In the best groups, leadership isn’t just for whoever has the formal title of leader; it’s a team sport. Every group member can shape group dynamics. When you lack formal authority, you have three main ways to influence your group: asking questions, modeling norms, and attributing leadership to others.
One of the most powerful tools is asking questions. Asking questions about the goals, norms, and processes a group is using can spark important conversations. What are we trying to accomplish here? Why do we do things the way we do them? How can we improve? Questions like these invite overlooked perspectives and help get everyone on the same page.
Early in a group’s life, norms emerge easily. Members look to one another for cues about what’s appropriate. So, modeling norms that promote open communication and psychological safety matters enormously. If you want more candor, show it. If you want curiosity, ask thoughtful questions.
As a group member, you have a choice in whom you look to as a role model, whom you turn to for advice, and whose suggestions you endorse. These are small ways of attributing informal leadership to other members. Over time, informal attributions of leadership can increase the status of other group members, thereby giving them more influence over group dynamics.
Start small. Ask a better question, name an unspoken issue, or model the behavior you want to see. You don’t need permission to improve a group. If you play your cards right, your group can become more than the sum of its parts.
This article originally appeared in Next Big Idea Club magazine and is reprinted with permission.
While executives debate AI strategy in boardrooms, the real disruption is already happening on the frontlines. From automated scheduling to AI-assisted diagnostics to customer service chatbots, frontline workers are increasingly interacting with intelligent systems. Yet too many organizations still treat AI as a corporate workplace issue, overlooking the people who are most exposed to its impact. That’s a mistake.
If companies want to ensure their operations stay competitive, they need to remain committed to investing in the people who are closest to the work. The frontline is the proving ground. If your AI strategy fails there, it fails everywhere.
According to a recent IBM report, 40% of workers will need to reskill in the next three years due to AI and automation. Yet many companies still deprioritize frontline education. That’s not just shortsighted; it’s expensive. Turnover, disengagement, and operational inefficiencies all spike when workers aren’t equipped to adapt.
Some companies are getting it right. Carter’s, CVS Health, McDonald’s, and Papa John’s have all invested in education benefits that make learning accessible to hourly and frontline employees. These programs not only offer tuition assistance but provide career pathways, coaching, and short-form credentials that align with real business needs.
McDonald’s is proving that frontline education isn’t merely a perk but a strategic imperative. Through its Archways to Opportunity program, McDonald’s and its participating franchisees offer restaurant employees access to high school completion, college tuition assistance, English language courses, and career coaching. The results are clear: over 90,000 crew members have participated, with more than $240 million invested in tuition assistance.
According to a recent survey of Archways participants, 75% say the program helped them pursue a career in a new field or industry, 79% report learning job and life skills they still use today, and 88% gained greater confidence in their abilities. Additionally, nearly two-thirds say Archways helped them earn more or get a raise, and 55% say it helped them get promoted faster. As AI reshapes frontline roles, McDonald’s is leaning into the human skills that matter most (communication, teamwork, resilience) and equipping its workforce to thrive in a tech-enabled future.
If you’re a CHRO or CEO wondering where to begin, here are three immediate actions that can drive impact:
Stop Gatekeeping Education: Too often, learning opportunities are reserved for salaried or corporate employees, leaving out the very people who keep operations running: frontline, hourly, and part-time workers. Making education accessible means removing upfront costs, offering flexible formats that fit around shift work, and ensuring that programs deliver a clear return on investment for the learner. When companies like Carter’s expanded access to education benefits, they didn’t just improve participation; they built stronger pipelines for internal mobility and retention.
Start Laying the Groundwork for AI Readiness: If your organization is investing in automation, it must also invest in supporting workforce readiness and the long-term success of the people who will be impacted by it. That doesn’t always mean launching AI-specific training on day one, but it does mean creating pathways for frontline employees to develop core technology skills and competencies, and gain future-ready credentials.
CVS Health, for example, offers no-cost access to over 80 degree and credential programs through its tuition assistance program, which includes access to AI-specific trainings. The infrastructure is in place for employees to pursue relevant skills as business needs evolve. The key is to ensure HR, L&D, and IT are aligned so that when AI adoption accelerates, your workforce is prepared, not starting from zero.
Tell Better Stories: Highlighting the real employees who are investing in their development and growth through education programs isn’t just good PR; it’s a powerful internal engagement strategy. When employees see their peers advancing, it makes learning feel achievable and shows that growth is possible for everyone.
These stories should be shared widely, with clear pathways to opportunities for advancement, wage increases, or new roles. Papa John's has done this well through its Dough & Degrees program, turning learners into ambassadors and reinforcing the message that growth is possible at every level of the organization.
AI isn't going to replace your workforce, but it will reveal whether you've invested in your people. It will expose the gaps between the companies that talk about transformation and the ones that actually prepare their people for it. The winners in this next era won't be the ones with the most sophisticated algorithms or the biggest tech budgets. They'll be the ones who saw AI not as a shortcut, but as a signal to double down on human potential.
Nearly 30 years ago, when Google launched the search engine that started its long march to dominance, its founders started without much hardware.
Known at first as BackRub and operated on the Stanford campus, the company's first experimental server packed 40 gigabytes of data and was housed in a case made of Duplo blocks, the oversize version of Lego. Later, thanks to donations from IBM and Intel, the founders upgraded to a small server rack. In 2025, you can't even fit Google search in a single data center, something that's been true for a long time.
Still, with a little clever resourcing and a lot of work, you can get pretty close to a modern Google-esque experience using a machine roughly the size of that original Google server. You can even house it in your laundry room.
That's where Ryan Pearce decided to put his new search engine, the robust Searcha Page, which has a privacy-focused variant called Seek Ninja. If you go to these web pages, you're hitting a server next to Pearce's washer and dryer. Not that you could tell from the search results.
"Right now, in the laundry room, I have more storage than Google in 2000 had," Pearce says. "And that's just insane to think about."
Pearce's DIY search engine largely eschews the cloud. The top machine leverages old server parts as well as a makeshift vent to push away the heat those parts produce. The bottom computer provides a little extra support to the setup. [Photo: courtesy of Ryan Pearce]
Why the laundry room? Two reasons: heat and noise. Pearce's server was initially in his bedroom, but the machine ran so hot it made the room too uncomfortable to sleep in. He has a separate bedroom from his wife because of sleep issues, but her prodding made him realize a relocation was necessary. So he moved it to the utility room, drilled a route for a network cable to get through, and now, between clothes cycles, it's where his search engines live. "The heat hasn't been absolutely terrible, but if the door is closed for too long, it is a problem," he says.
Other than a little slowdown in the search results (which, to Pearce's credit, has improved dramatically over the past few weeks), you'd be hard-pressed to see where the gaps in his search engine lie. The results are often of higher quality than you might expect. That's because Searcha Page and Seek Ninja are built around a massive database that's 2 billion entries strong. "I'm expecting to probably be at 4 billion documents within a half year," he says.
By comparison, the original Google had 24 million pages in its database in 1998, while it was still hosted at Stanford; by 2020, the index had grown to 400 billion, a fact revealed in 2023 during the United States v. Google LLC antitrust trial.
By current Google standards, 2 billion pages are a drop in the bucket. But it's a pretty big bucket.
The not-so-secret ingredient: AI
The scale that Pearce is working at is wild, especially given that he's running it on what is essentially discarded server hardware. The secret to making it all happen? Large language models.
"What I'm doing is actually very traditional search," Pearce says. "It's what Google did probably 20 years ago, except the only tweak is that I do use AI to do keyword expansion and assist with the context understanding, which is the tough thing."
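To make that concrete, here is a minimal sketch of what LLM-driven keyword expansion in front of a traditional index can look like. The endpoint, model name, and prompt are illustrative assumptions, not Pearce's actual code; any OpenAI-compatible inference service would work the same way.

```python
from openai import OpenAI

# Hypothetical sketch of LLM keyword expansion feeding a classic index.
# The endpoint and model name are assumptions, not Pearce's real setup.
client = OpenAI(
    base_url="https://api.sambanova.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_KEY",
)

def expand_query(query: str) -> list[str]:
    """Ask an LLM for related terms, keeping the original query too."""
    resp = client.chat.completions.create(
        model="Meta-Llama-3.1-8B-Instruct",  # assumed model identifier
        messages=[{
            "role": "user",
            "content": f"List five search keywords related to: {query}. One per line, no prose.",
        }],
    )
    extra = [ln.strip() for ln in resp.choices[0].message.content.splitlines() if ln.strip()]
    return [query] + extra

# The expanded terms would then feed an ordinary inverted-index lookup, e.g.:
#   index.search(" OR ".join(expand_query("laundry room server")))
```

The LLM only broadens the query; the retrieval and ranking stay entirely conventional, which is what keeps the approach cheap enough to run against a commodity inference service.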
Pearce's search engines emphasize a minimalist look, and a desire for honest user feedback.
If you're trying to avoid AI in your search, you might think, "Hey, wait, is this actually what I want?" But it's worth keeping in mind that AI has long been a key part of search engines' DNA. Tools such as reverse image search, for example, couldn't work without it. Long before we learned about glue on pizza, Google had been working to implement AI-driven context in more subtle ways, adding RankBrain to the mix about a decade ago. And in 2019, Microsoft executives told a search marketing conference that 90% of Bing's search results came from machine learning, years before the search engine gained a chat window.
In many ways, the frustration many users have with LLMs may oversimplify the truth about AI's role in search. It was already deeply embedded in modern search engines well before Google and Microsoft began to put it in the foreground.
And what we're now learning is that AI is a great way to build and scale a search engine, even if you're an army of one.
Scaling on the cheap
In many ways, Pearce is leaning into an idea that has picked up popular relevance in recent years: self-hosting. Many self-hosters might use a mini PC or a Raspberry Pi. But when you're trying to build your own Google, you're going to need a little more power than can fit in a tiny box.
Always curious about what it would be like to build a search engine himself, Pearce recently decided to actually do it, buying up a bunch of old server gear powerful enough to manage hundreds of concurrent sessions. It's more powerful than some of Google's early server setups.
"Miniaturization has just made it so achievable," he says.
Enabling this is a concept I like to call "upgrade arbitrage," where extremely powerful old machines (particularly those targeting the workstation or server market) fall in price so significantly that the gear becomes attractive to bargain hunters. Many IT departments work around traditional upgrade cycles, usually around three years, meaning there's a lot of old gear on the market. If buyers are willing to accept the added energy costs that come with the older gear, savvy gadget shoppers can get a lot of power for not a lot of up-front money.
The beefy CPU running this setup, a 32-core AMD EPYC 7532, underlines just how fast technology moves. At the time of its release in 2020, the processor alone would have cost more than $3,000. It can now be had on eBay for less than $200, and Pearce bought a quality-control test version of the chip to save even more money.
"I could have gotten another chip for the same price, which would have had twice as many threads, but it would have produced too much heat," he says.
Wilson Lin's cloud-based search engine, which uses a vector database, includes short LLM-produced summaries of every result, which vary in length.
What he built isn't cheap (the system, all in, cost $5,000, with about $3,000 of that going toward storage), but it's orders of magnitude less expensive than the hardware would have cost new. (Half a terabyte of RAM isn't cheap, after all.) While there are certain off-site things that Pearce needs to lean on, the search engine itself runs from this box. It's bigger than a bread box, but a lot smaller than the cloud.
This is not how most developers approach complex software projects nowadays. Fellow ambitious hobbyist Wilson Lin, who recently described on his personal blog his efforts to create a search engine of his own, took the opposite approach from Pearce. He developed his own data-parsing technologies to shrink the cost of running a search engine to pennies on the dollar compared to competing engines, leaning on at least nine separate cloud technologies.
"It's a lot cheaper than [Amazon Web Services], a significant amount," Lin says. "And it gives me enough capacity to get somewhere with this project on a reasonable budget."
Why are these developers able to get so close to what Google is building on relatively tight budgets and minimal hardware builds? Ironically, you can credit the technology many users blame for Google's declining search quality: LLMs.
Catching up via LLMs
One of the biggest points of controversy around search engines is the overemphasis on artificial intelligence. Usually the result shows up in a front-facing way, by trying to explain your searches to you. Some people like the time savings. Some don't. (Given that I built a popular hack for working around Google's AI summaries, it might not surprise you to learn that I fall into the latter category.)
But when you're attempting to build a dataset without a ton of outside resources, LLMs have proven an essential tool for reaching scale from a development and contextualization standpoint.
Pearce, who has a background in both enterprise software and game development, has not shied away from the programming opportunity that LLMs offer. What's interesting about his model is that he's essentially building the many parts that make up a traditional search engine, piecemeal. He estimates his codebase has around 150,000 lines of code at this juncture.
"And a lot of that is going back and reiterating," he says. "If you really consider it, it's probably like I've iterated over like 500,000 lines of code."
Much of his iteration comes in the form of taking features initially managed by LLMs and rewriting them to work more traditionally. That's created a design approach that allows him to build complex systems relatively quickly, and then iterate on what's working.
"I think it's definitely lowered the barrier," Lin says of LLMs' role in enabling DIY search engines. "To me, it seems like the only barrier to actually competing with Google, creating an alternate search engine, is not so much the technology, it's mostly the market forces."
Seek Ninja, the more private of Pearce's two search engines, does not save your profile or use your location, making it a great incognito-mode option.
The complexity of LLMs makes them one of the few things Pearce can't host on-site in his laundry room setup. Searcha Page and Seek Ninja instead use a service called SambaNova, which provides speedy access to the Llama 3 model at a low cost.
Annie Shea Weckesser, SambaNova's CMO, notes that access to low-cost models is increasingly becoming essential for solo developers like Pearce, adding that the company is giving developers the tools to run powerful AI models quickly and affordably, whether they're working from a home setup or running in production.
Pearce has other advantages that Sergey Brin and Larry Page didn't have three decades ago when they founded Google, including access to the Common Crawl repository. That open repository of web data, an important (if controversial) enabler of generative AI, has made it easier for him to build his own crawler. Pearce says he was actually blocked from Common Crawl at one point as he built his moonshot.
"I really appreciate them. I wish I could give them back something, but maybe when I'm bigger," he says. "It's a really cool organization, and I want to be less dependent on them."
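For a sense of how accessible that data is, here is a minimal sketch of querying Common Crawl's public CDX index for captures of a domain. The crawl ID below is just an example (current IDs are listed at https://index.commoncrawl.org/), and this is illustrative of the public API, not of Pearce's actual pipeline.

```python
import json
import urllib.request

# Query Common Crawl's public CDX index for captures of example.com.
CRAWL_ID = "CC-MAIN-2024-33"  # example crawl ID; swap in a current one
url = f"https://index.commoncrawl.org/{CRAWL_ID}-index?url=example.com&output=json"

with urllib.request.urlopen(url) as resp:
    for line in resp:
        record = json.loads(line)
        # Each record points at a WARC file, byte offset, and length, which
        # is enough to fetch the raw capture from Common Crawl's storage.
        print(record["url"], record["filename"], record["offset"], record["length"])
```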
Small scale, big ambitions
There are places where Pearce has had to scale back his ambitions somewhat. For example, he initially thought he'd build his search engine using a vector database, which represents items as numerical embeddings so that closely related ones sit near one another.
"But that completely bombed," he says. "It was probably a lack of skill on my part. It did search, but . . . the results were very artistic, let's say," hinting at the fuzziness and hallucination that LLMs are known for.
Vector search, while complex, is certainly possible; that's what Lin's search engine uses, in the form of a self-created tool called CoreNN. It presents results differently from Pearce's search engine, which works more like Google: rather than using the meta descriptions most web pages have, Lin's engine uses an LLM to briefly summarize the page itself and how it relates to the user's search term.
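As a rough illustration of the underlying idea (not of CoreNN's internals, which are Lin's own work), here is a toy vector search in Python. The embed() function is a stand-in that produces dummy vectors; a real system would call an actual embedding model so that related texts land near one another.

```python
import numpy as np

# Toy sketch of vector search: embed texts, rank by cosine similarity.
def embed(text: str) -> np.ndarray:
    # Placeholder embedding, deterministic within one process run.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)  # unit length, so dot product = cosine

docs = [
    "server in the laundry room",
    "tuition assistance programs",
    "used EPYC CPUs on eBay",
]
doc_vecs = np.stack([embed(d) for d in docs])

def search(query: str, k: int = 2) -> list[str]:
    scores = doc_vecs @ embed(query)    # cosine similarity per document
    top = np.argsort(scores)[::-1][:k]  # indices of the best matches
    return [docs[i] for i in top]

print(search("cheap used processors"))
```

With a real embedding model, the nearest neighbors are semantic matches rather than keyword matches, which is both the appeal of the approach and, as Pearce found, the source of its occasional "artistic" results.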
"Once I actually started, I realized this is really deep," Lin says of his project. "It's not a single system, or you're just focused on like a single part of programming. It's like a lot of different areas, from machine learning and natural language processing, to how do you build an app that is smooth and low latency?"
Pearce's Searcha Page is surprisingly adept at local searches, and can help find nearby food options quickly, based on your location.
And then there's the concept of doing a small-site search, along the lines of the noncommercial search engine Marginalia, which favors small sites over Big Tech. That was actually Pearce's original idea, one that he hopes to get back to once he nails down the slightly broader approach he's taken.
But there are already ideas emerging that weren't even on Pearce's radar.
"Someone from China actually reached out to me because . . . I think he wanted an uncensored search engine that he wanted to feed through his LLM, like his agent's search," he says.
It's not realistic at this time for Pearce to expand beyond English; besides the additional costs, it would essentially require him to build brand-new datasets. But such interest hints at the sheer power of his idea, which, given its location, he can literally hear.
He does see a point where he moves the search engine outside his home. He's a cloud skeptic, so it would likely be to a colocation facility or similar type of data center. (Helping to pay for that future, he has started to dabble in some modest affiliate-style advertising, which tends to be less invasive than traditional banner ads.)
"My plan is if I get past a certain traffic amount, I am going to get hosted," Pearce says. "It's not going to be in that laundry room forever."