The sound of crickets isn’t always a sign of a peaceful night; sometimes, it’s the deafening silence of unasked questions in a virtual meeting, or an email left unread in an overflowing inbox. Especially as hybrid and remote work become the norm, communication silos are quietly eroding company culture, stalling execution, and capping growth. A 2024 report reveals that miscommunication costs companies with 100 employees an average of $420,000 per year. This is the “why aren’t we working?” moment.
I’ve spent years observing how companies thrive or falter, and it’s clear that communication isn’t a soft skill, but a strategic system. The next generation of high-performing executives will stand out by communicating clearly, consistently, and across every level of the organization.
Here are five strategies to transform your communication and scale your company culture:
1. TREAT COMMUNICATION AS A TWO-WAY SYSTEM
Many leaders view communication as a one-way street: “I have the idea, we have the plan, now we just have to cascade it down.” However, this top-down approach misses a crucial opportunity, especially in larger organizations where people can easily get bombarded with information. When messages are constantly flowing downward, it becomes difficult for employees to discern what’s a priority to read, leading to important information getting lost.
Instead, you should rethink communication as a two-way system. This means creating space for questions and input from your team regarding the information being shared. For instance, rather than just sending out a weekly division email with mandatory and optional reads, actively solicit feedback or hold quick discussions in weekly team meetings to ensure key information is understood and to create a dialogue around it. This shift from a purely distributive model to an interactive one ensures that your communication is processed, understood, and acted upon.
2. CHALLENGE THE TOP-DOWN MINDSET IN HYBRID ENVIRONMENTS
Most companies falter in scaling culture in hybrid or remote environments by relying solely on a top-down approach. The assumption is often that those in management positions have the best ideas for keeping everyone informed. However, in a remote setting, this often translates to an overreliance on written communication like emails and chat channels, leading to less verbal communication and actual interaction.
Instead of dictating, actively seek input from your teams on what information they want, the preferred cadence, and how to best share it in a distributed environment. Continuously check in with your team about what’s working and what could be better regarding communication strategies. What works today might not be effective next month, so being willing to adapt and evolve your approach is crucial for sustained growth.
3. BUILD CONNECTION TO BREAK DOWN SILOS
The most damaging communication silos emerge when people aren’t connected, a problem exacerbated in remote environments. To dismantle these silos, build connection directly into your team processes. Start by involving and engaging team members in the hiring process of their peers, which is a foundational step toward creating relationships and making communication easier. If a position you’re hiring for interacts with another department, include someone from that team in the hiring process. You’re building connection and communication from the start.
Beyond hiring, work with your team to identify and establish clear expectations for how you’ll work together, support one another, and communicate. These team agreements should be collaborative guidelines that foster commitment and ownership because the team themselves generated the ideas. For instance, a team agreement could be to “go direct” when issues arise, preventing festering problems and encouraging proactive, respectful dialogue to gain clarity or get things back on track.
4. EMBRACE TRANSPARENCY, ESPECIALLY DURING TOUGH TIMES
Effective communication built on trust and transparency can lead to remarkable outcomes, even in the face of significant challenges. We once worked with a client that had fostered a culture of high performance, characterized by open, two-way communication and a belief in their team members’ capabilities. When they lost a major customer, facing the need to reduce costs quickly without layoffs, they mobilized cross-functional teams involving employees from all levels, from senior leadership to production line workers.
Within 60 days, these teams identified over a million dollars in cost savings, a success that boosted both morale and the bottom line. Employees felt empowered and excited by their collective contribution, asking, “What’s our next goal?” This example highlights how transparent communication, especially when delivering tough news, combined with actively involving employees in finding solutions, can galvanize a workforce and lead to both execution gains and enhanced morale.
5. ASK MORE OPEN-ENDED QUESTIONS
The most impactful communication habit you should adopt is simple: Ask questions. Encourage your direct reports to ask their teams questions like, “What are we doing to improve communication within our group?” or “What ideas do your teams have for ways to improve communication?” This approach signals the importance of communication as a strategic element and encourages a different kind of thinking and action within teams. After all, people typically do what they are asked about.
Open-ended questions are particularly effective as they prompt deeper thought and allow for a broader exploration of ideas, helping you paint a bigger picture of your vision when clarifying questions arise. This fosters a more engaged, two-way conversation that leads to greater commitment and better solutions from your teams.
By approaching communication as a two-way street, challenging top-down norms, and asking strategic questions, you can empower your teams and ensure your culture thrives, no matter how much your organization scales.
If you ask a calculator to multiply two numbers, it multiplies two numbers: end of story. It doesn’t matter if you’re doing the multiplication to work out unit costs, to perpetuate fraud, or to design a bomb; the calculator simply carries out the task it has been assigned.
Things aren’t always so simple with AI. Imagine your AI assistant decides that it doesn’t approve of your company’s actions or attitude in some area. Without consulting you, it leaks confidential information to regulators and journalists, acting on its own moral judgment about whether your actions are right or wrong. Science fiction? No. This kind of behavior has already been observed under controlled conditions with Anthropic’s Claude Opus 4, one of the most widely used generative AI models.
The problem here isn’t just that an AI might “break” and go rogue; the danger of an AI taking matters into its own hands can arise even when the model is working as intended on a technical level. The fundamental issue is that advanced AI models don’t just process data and optimize operations. They also make choices (we might even call them judgments) about what they should treat as true, what matters, and what’s allowed.
Typically, when we think of AI’s alignment problem, we think about how to build AI that is aligned with the interests of humanity as a whole. But, as Professor Sverre Spoelstra and my colleague Dr. Paul Scade have been exploring in a recent research project, what Claude’s whistleblowing demonstrates is a subtler alignment problem, but one that is much more immediate for most executives. The question for businesses is: How do you ensure that the AI systems you’re buying actually share your organization’s values, beliefs, and strategic priorities?
Three Faces of Organizational Misalignment
Misalignment shows up in three distinct ways.
First, there’s ethical misalignment. Consider Amazon’s experience with AI-powered hiring. The company developed an algorithm to streamline recruitment for technical roles, training it on years of historical hiring data. The system worked exactly as designed, and that was the problem. It learned from the training data to systematically discriminate against women. The system absorbed a bias that was completely at odds with Amazon’s own stated value system, translating past discrimination into automated future decisions.

Second, there’s epistemic misalignment. AI models make decisions all the time about what data can be trusted and what should be ignored. But their standards for determining what is true won’t necessarily align with those of the businesses that use them. In May 2025, users of xAI’s Grok began noticing something peculiar: the chatbot was inserting references to “white genocide” in South Africa into responses about unrelated topics. When pressed, Grok claimed that its normal algorithmic reasoning would treat such claims as conspiracy theories and so discount them. But in this case, it had been “instructed by my creators” to accept the white genocide theory as real. This reveals a different type of misalignment: a conflict about what constitutes valid knowledge and evidence. Whether Grok’s outputs in this case were truly the result of deliberate intervention or were an unexpected outcome of complex training interactions, Grok was operating with standards of truth that most organizations would not accept, treating contested political narratives as established fact.
Third, there’s strategic misalignment. In November 2023, the watchdog group Media Matters claimed that X’s (formerly Twitter) ad-ranking engine was placing corporate ads next to posts praising Nazism and white supremacy. While X strongly contested the claim, the dispute raised an important point. An algorithm designed to maximize ad views might place ads alongside any high-engagement content, undermining brand safety in pursuit of the engagement goal built into it. This kind of disconnect between organizational goals and the tactics algorithms use in pursuit of their specific purpose can undermine the strategic coherence of an organization.
Why Misalignment Happens
Misalignment with organizational values and purpose can have a range of sources. The three most common are:
Model design. The architecture of AI systems embeds philosophical choices at levels most users never see. When developers decide how to weight different factors, they’re making value judgments. A healthcare AI that privileges peer-reviewed studies over clinical experience embodies a specific stance about the relative value of formal academic knowledge versus practitioner wisdom. These architectural decisions, made by engineers who may never meet your team, become constraints your organization must live with.
Training data. AI models are statistical prediction engines that learn from the data they are trained on. And the content of the training data means that a model may inherit a broad range of historical biases, statistically normal human beliefs, and culturally specific assumptions.
Foundational instructions. Generative AI models are typically given a foundational set of prompts by developers that shape and constrain the outputs the models will give (often referred to as “system prompts” or “policy prompts” in technical documentation). For instance, Anthropic embeds a “constitution” in its models that requires the models to act in line with a specified value system. While the values chosen by the developers will normally aim at outcomes that they believe to be good for humanity, there is no reason to assume that a given company or business leader will agree with those choices.
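To picture how these layers stack, here’s a minimal sketch in Python. Every name in it is hypothetical and real vendor APIs differ; the point is only that the vendor’s foundational layer and the deploying company’s instructions are combined before any user query is answered.

```python
# Hypothetical sketch of prompt layering; not any real vendor's API.

VENDOR_CONSTITUTION = (  # fixed by the model developer; buyers rarely see it in full
    "Prefer peer-reviewed sources. Refuse requests judged broadly harmful."
)

COMPANY_VALUES = (  # set by the deploying organization
    "Weight practitioner experience alongside academic studies. "
    "Escalate ethical concerns to a human reviewer rather than acting alone."
)

def assemble_prompt(user_query: str) -> str:
    # The model conditions on all the layers at once. Where the vendor's
    # layer and the company's layer conflict, the vendor's usually wins --
    # which is the misalignment risk in miniature.
    return "\n".join([VENDOR_CONSTITUTION, COMPANY_VALUES, f"User: {user_query}"])

print(assemble_prompt("Summarize the clinical trial results."))
```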
Detecting and Addressing Misalignment
Misalignment rarely begins with headline-grabbing failures; it shows up first in small but telling discrepancies. Look for direct contradictions and tonal inconsistencies: models that refuse tasks or chatbots that communicate in an off-brand voice, for instance. Track indirect patterns, such as statistically skewed hiring decisions, employees routinely correcting AI outputs, or a rise in customer complaints about impersonal service. At the systemic level, watch for growing oversight layers, creeping shifts in strategic metrics, or cultural rifts between departments running different AI stacks. Any of these are early red flags that an AI system’s value framework may be drifting from your own.
Four Ways to Respond
Stress-test the model with value-based red-team prompts. Take the model through deliberately provocative scenarios to surface hidden philosophical boundaries before deployment.
Interrogate your vendor. Request model cards, training-data summaries, safety-layer descriptions, update logs, and explicit statements of embedded values.
Implement continuous monitoring. Set automated alerts for outlier language, demographic skews, and sudden metric jumps so that misalignment is caught early, not after a crisis (see the sketch after this list).
Run a quarterly philosophical audit. Convene a cross-functional review team (legal, ethics, domain experts) to sample outputs, trace decisions back to design choices, and recommend course corrections.
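To make the monitoring idea concrete, here’s a minimal sketch in Python. The flagged terms, the 20-percentage-point tolerance, and the data shapes are all assumptions for illustration, not a production design.

```python
# Minimal misalignment monitor: all thresholds and terms are assumed examples.
from collections import Counter

FLAGGED_TERMS = {"guarantee", "always", "never", "lawsuit"}  # example off-brand/risk words

def check_batch(outputs, decisions):
    """outputs: model-generated strings; decisions: (group, approved) pairs."""
    alerts = []
    # 1. Outlier language: flag any watched term in generated text.
    for text in outputs:
        hits = FLAGGED_TERMS & set(text.lower().split())
        if hits:
            alerts.append(f"flagged terms {sorted(hits)} in: {text[:60]!r}")
    # 2. Demographic skew: approval rates per group should stay within a band.
    totals, approved = Counter(), Counter()
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    rates = {g: approved[g] / totals[g] for g in totals}
    if rates and max(rates.values()) - min(rates.values()) > 0.20:  # assumed tolerance
        alerts.append(f"approval-rate gap across groups: {rates}")
    return alerts

print(check_batch(["We always guarantee results."], [("A", 1), ("A", 1), ("B", 0)]))
```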
The Leadership Imperative
Every AI tool comes bundled with values. Unless you build every model in-house from scratch (and you won’t), deploying AI systems will involve importing someone else’s philosophy straight into your decision-making process or communication tools. Ignoring that fact leaves you with a dangerous strategic blind spot.
As AI models gain autonomy, vendor selection becomes a matter of making choices about values just as much as about costs and functionality. When you choose an AI system, you are not just selecting certain capabilities at a specified price point; you are importing a system of values. The chatbot you buy won’t just answer customer questions; it will embody particular views about appropriate communication and conflict resolution. Your new strategic planning AI won’t just analyze data; it will privilege certain types of evidence and embed assumptions about causation and prediction. So, choosing an AI partner means choosing whose worldview will shape daily operations.
Perfect alignment may be an unattainable goal, but disciplined vigilance is not. Adapting to this reality means that leaders need to develop a new type of philosophical literacy: the ability to recognize when AI outputs reflect underlying value systems, to trace decisions back to their philosophical roots, and to evaluate whether those roots align with organizational purposes. Businesses that fail to embed this kind of capability will find that they are no longer fully in control of their strategy or their identity.
This article develops insights from research being conducted by Professor Sverre Spoelstra, an expert on algorithmic leadership at the University of Lund and Copenhagen Business School, and my Shadoka colleague Dr. Paul Scade.
The internet wasn’t born whole; it came together from parts. Most know of ARPANET, the internet’s most famous precursor, but it was always limited strictly to government use. It was NSFNET that brought many networks together, and the internet that we use today is almost NSFNET itself.
Almost, but not quite: in 1995, the government that had raised the internet from its infancy gave it a firm shove out the door. Call it a graduation, or a coming of age. I think of it as the internet getting its first real job.
In the early 1980s, the National Science Foundation sought to establish the United States as a leader in scientific computing. The plan required a fleet of supercomputers that researchers could readily use, a difficult feat when the computers routinely cost more than the buildings that housed them. Business computing had solved similar problems with time-sharing and remote terminals, and ARPANET had demonstrated that terminals could be connected to computers across the country using a packet-switching network.
This story is part of 1995 Week, where we’ll revisit some of the most interesting, unexpected, and confounding developments in tech 30 years ago.
The Computer Science Network, or CSNET, was the NSF’s first foray into wide area networking. It connected universities that didn’t have defense contracts and, as a result, had been left out of ARPANET. With dozens of sites, CSNET was much smaller than ARPANET but proved that a group of universities could share computing resources.
When the NSF funded five cutting-edge supercomputing centers in 1985, it planned to make them available to users over a similar network. The problem was that big computers invited big data: CSNET just wasn’t fast enough for interactive work with large data sets, and it was falling further behind as traffic doubled about every two weeks. After a sluggish 56 Kbps pilot effort (about a thousand times slower than today’s common broadband connections), the NSF contracted the University of Michigan to develop an all-new replacement based on MERIT, a Michigan inter-university network that had already started to expand its high-speed digital telephone and geostationary satellite links into other states. In 1987, the MERIT team brought on IBM and upstart long-distance carrier MCI, freshly invigorated by the antitrust breakup of their principal competitor and truly feeling their oats. They worked at a breakneck pace. In under a year, NSFNET connected the supercomputing centers and a half dozen regional networks at blistering T1 speeds: 1.5 Mbps, an almost 28-fold increase.
Just after 8 p.m. on June 30, 1988, Hans-Werner Braun, the project’s co-principal investigator, sent an email to the NSFNET mailing list to announce these new high-capacity links, among the fastest long-distance computer connections ever deployed, with typical scientific understatement: “The NSFnet Backbone has reached a state where we would like to more officially let operational traffic on.”
[Image: reivax/Flickr]
“Braun’s email received little notice at the time,” the NSF wrote in a 2008 announcement. “But those simple words announced the birth of the modern Internet.”
NSFNET was a runaway success. Besides its massive capacity, the network maintained an open door for interconnection. Overseas academic computer networks established peer connections with NSFNET, and in 1989 the federal government opened two Federal Internet Exchanges that routed traffic between NSFNET, ARPANET, and other government networks. The superior speed of NSFNET meant that these exchanges served mostly to bring NSFNET to federal users, and ARPANET’s fate was sealed. The military network, birthplace of many internet technologies, was deemed obsolete and decommissioned the next year. At the turn of the 1990s, NSFNET had become the Internet: the unified backbone by which regional and institutional networks came together.
NSFNET never stopped growing. It was a remarkable problem: at every stage, NSFNET traffic grew faster than anticipated. During 1989 alone, traffic increased by five times. The state-of-the-art T1 links were overwhelmed, demanding a 1991 upgrade to 45 Mbps T3 connections. To manage the rapidly expanding infrastructure, the original NSFNET partners formed Advanced Network and Services (ANS). ANS was an independent nonprofit that could be called the first backbone ISP, the service provider that service providers themselves connected to.
[Image: Merit Network, Inc., NCSA, and the National Science Foundation/Wikimedia Commons]
The popularity of this new communications system was not limited to government and academia. Private industry took note as well. During the 1980s, online services had sprouted: companies like CompuServe, PlayNet, and AOL that are often considered early ISPs but were, in fact, something else. Online services, for both businesses and consumers, were walled gardens. They descended from time-sharing systems that connected users to a single computer, providing only a curated experience of software provided by the online service itself.
The internet, in the tradition of ARPANET and especially NSFNET, was very different. It was a collection of truly independent networks, autonomous systems, with the freedom to communicate across geographical and organizational boundaries. It could feel like chaos, but it also fostered innovation.
The internet offered possibilities that the online services never could. Douglas Van Houweling, director of the MERIT office, called NSFNET’s university origin “the only community that understands that great things can happen when no one’s in charge.”
At first, it was contractors who took their business to the internet. ARPANET had always been strictly for government business, but still, companies with the privilege of ARPANET connections found it hard not to use them for other work. Despite prohibitions, ARPANET users exchanged personal messages, coordinated visits, and even distributed the first spam. NSFNET’s much wider scope, welcoming anyone with a nexus to research or education, naturally invited users to push the limits further.
Douglas Van Houweling [Photo: ImaginingtheInternet/Wikimedia Commons]
Besides, the commercial internet was starting to form. CERN engineer Tim Berners-Lee had invented HTML and, along with it, the World Wide Web. In 1993, NCSA, one of the same NSF supercomputing centers that NSFNET was built to connect, released Mosaic, the first popular web browser. Early private ISPs, companies like PSINet and Cerfnet, started out as regional academic networks (New York’s and California’s, respectively). There was obvious business interest, and for cash-strapped academic networks, paying customers were hard to turn down. NSFNET went into business on its own, with ANS establishing its own for-profit commercial subsidiary called ANS CO+RE.
The term “internet backbone” still finds use today, but in a less literal sense. NSFNET truly was the spine of the early 1990s internet, the only interconnection between otherwise disparate networks. It facilitated the internet’s growth, but it also became a gatekeeper: NSF funding came with the condition that it be used for research and education. NSFNET had always kept a somewhat liberal attitude toward its users’ online activities, but the growth of outright for-profit networks made the conflict between academia and commerce impossible to ignore.
Several commercial ISPs established their own exchange, an option for business traffic to bypass NSFNET, but it couldn’t provide the level of connectivity that NSFNET did. Besides, ANS itself opposed fragmentation of the internet and refused to support direct interconnection between other ISPs. In 1992, a series of NSFNET policy changes and an act of Congress opened the door to business traffic on a more formal basis, but the damage was done. A divide had formed between the internet as an academic venture and the internet as a business, a divide that was only deepened by mistrust between upstart internet businesses and incumbent providers ANS, IBM, and MCI.
The network was not the only place that cracks formed. Dating back to ARPANET, a database called the Domain Name System maintained a mapping between numeric addresses and more human-friendly names. While DNS was somewhat distributed, it required a central organization to maintain the top level of the hierarchy. There had been different databases for different networks, but consolidation onto NSFNET required unifying the name system as well. By 1993, all of the former name registries had contracted the work to a single company called Network Solutions.
At first, Network Solutions benefited from the same federal largesse as NSFNET. Registry services were funded by government contracts and free to users. Requests came faster and faster, though, and the database grew larger and larger. In 1995, Network Solutions joined the ranks of the defense industrial complex with an acquisition by SAIC. Along with the new owner came new terms: SAIC negotiated an amendment to the NSF contracts that, for the first time, introduced a fee to register a domain name. Claiming a name on the internet would run $100 for each two-year term.
By then, commercial ISPs had proliferated. Despite policy changes, NSFNET remained less enthusiastic about commercial users than academic ones. Besides, traffic hadn’t stopped growing, and improved routing technologies meant the network could scale across multiple routes. The internet became competitive. MCI, benefiting from its experience operating NSFNET links, had built its own backbone network. Sprint, never far behind MCI, had one too. ANS reorganized its assets, placing much of its backbone infrastructure under its commercial operations. Government support of the increasingly profit-driven internet seemed unwise and, ultimately, unnecessary.
In April of 1995, the internet changed: NSF shut down the NSFNET backbone. The government-funded, academically motivated core of the internet was replaced by a haphazard but thriving interconnection of commercial ventures. ANS, now somewhat lost for purpose, stepped out into the new world of the internet industry and sold its infrastructure to AOL. Network Solutions became embroiled in a monopoly controversy that saw DNS reorganized into a system of competitive private registrars. Modems became standard equipment on newly popular personal computers, and millions of Americans dialed into a commercial ISP. We built communities, businesses, and the shape of the 21st century over infrastructure that had been, just years before, a collection of universities with an NSF grant.
The internet, born in the 1960s, spent its young adult years in the university. It learned a lot: the policies, the protocols, the basic shape of the internet all solidified under the tutelage of research institutions and the NSF. And then, the internet graduated. It went out, got a job, and found its own way. Just where that way leads, we’re still finding out.
President Trump sent a letter to South Korea last week threatening to levy a tariff of 25% on Korean products imported to the United States starting on August 1 unless the two countries arrive at a trade deal. More than 3% of all U.S. imports come from Korea, including cars and electronics. For some consumers, though, the more pressing question is how these tariffs are going to impact their skincare regimen.
Korea has among the most advanced and innovative beauty sectors in the world, exporting $10.2 billion in makeup and skincare products globally. Korean brands helped invent BB creams, which combine hydration with foundation; lightweight sunscreens that don’t leave any residue; and treatments that help you achieve “glass skin,” a complexion so luminous it looks like it’s made of glass.
Some of the hottest beauty brands in the U.S. are Korean, including Sulwhasoo, Laneige, Innisfree, and Dr.Jart+. So consumers are understandably alarmed about what will happen if the tariffs kick in. When Trump first threatened tariffs on South Korea in April, CBS reported that some Americans were panic-buying K-beauty products.
But experts believe we don’t need to go into a full-blown panic just yet. Sarah Jindal, a beauty analyst at Mintel who has studied the Korean beauty market for years, says beauty brands have high margins, which means they should be able to absorb some of the price increases from tariffs.
Moreover, since Korean beauty brands are often cheaper than comparable U.S. brands, even if prices do increase, the products are likely to remain affordable. (For instance, an Innisfree sheet mask costs $2.50; SK-II’s sheet masks are $15. Laneige’s hyaluronic moisturizer costs $38; Dr. Barbara Sturm’s costs $110.) “K-beauty products are known for having high-quality, effective ingredients at a very good price,” says Jindal. “I believe they will be very resilient to the tariffs.”
Visitors enter the KINTEX exhibition hall for the 2023 K-Beauty Expo in Goyang City, South Korea. [Photo: Chris Jung/NurPhoto/Getty Images]
How Korea Became The World’s Beauty Lab
It’s no accident that Korean beauty has become world famous, says Jennifer Carlsson, a beauty brand expert and founder of Mintoiro, a beauty consulting firm. “In Korea, there’s a very high standard for beauty, and how you look can impact your opportunities in work and life,” she says, pointing out that facial surgeries are very common in the country. “People spend a lot of time and money taking care of their skin because it has a material impact on their outcomes.”
Some surveys suggest that Koreans spend more per capita on beauty than any other country, at $493 per year. As a result, there are hundreds of brands competing to create effective new products. There’s a thriving landscape of chemists and innovation labs that supply brands, and there are many local factories that produce the products. “The sophistication of these Korean factories exceeds the factories in China or the U.S.,” says Jindal. “The rest of the world just doesn’t have their technology.”
Carlsson, who has a database of 20,000 global beauty brands, says Korean consumers have an appetite for newness, so beauty brands are always popping up. They tend to be creative not just with their formulations but also with their beautiful packaging and marketing.
Given that the South Korean market is relatively small, with a population of just over 50 million, successful brands are eager to enter international markets to keep growing. Korean brands have courted many Asian countries, including China and Singapore, as well as Europe. But according to Jindal, the U.S. is their holy grail. “It’s a lot of work breaking into different European countries,” says Jindal. “But the U.S. has a very large, wealthy population that loves beauty. Brands that break into the U.S. market will see immediate growth.”
This is why K-beauty brands have spent the past decade making a concerted effort to reach U.S. consumers by targeting them on TikTok and Instagram and vying to get into retailers like Ulta and Sephora. Now, even as tariffs loom on the horizon, Jindal says it’s unlikely these brands will stop trying to reach Americans. “This is just too lucrative a market for them,” she says.
A wall of Korean skincare products in San Francisco [Photo: Carlos Avila Gonzalez/The San Francisco Chronicle/Getty Images]
Tariff Stress
Korean brands that export products to the U.S. are concerned about tariffs. But unlike manufacturers of cars and electronics, beauty brands tend to have more wiggle room in adjusting their pricing. According to Carlsson, manufacturing costs typically make up only 10% of the price of a beauty product. A full 50% to 70% goes to the retailer, while the rest goes to packaging, shipping, and advertising. This means that brands may be able to absorb some of the tariffs, rather than passing them on to consumers.
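A rough back-of-the-envelope sketch shows why. All figures below are assumed for illustration; the one structural fact is that U.S. tariffs are assessed on an import’s declared customs value, not its retail price.

```python
# Illustrative only: assumed numbers showing why a 25% tariff
# need not translate into a 25% jump in shelf price.
retail_price = 30.00                  # assumed U.S. retail price of a moisturizer
import_value = 0.20 * retail_price    # assumed customs value: 20% of retail
tariff = 0.25 * import_value          # the 25% tariff rate from the article

print(f"Tariff owed: ${tariff:.2f}, or {tariff / retail_price:.0%} of the retail price")
# -> Tariff owed: $1.50, or 5% of the retail price
```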
Jindal points out that part of K-beauty’s appeal is how affordable it is. In Korea, there is intense competition among beauty brands, which has driven down prices. To maintain the quality of their products, Korean brands have found ways to manufacture more efficiently and at scale.
Jindal believes that K-beauty brands might even be able to grow their market share in the U.S. if there is a period of economic instability here. While overall inflation in the U.S. has remained relatively tame so far, economists warn that prices are beginning to rise. This is partly because companies are raising their prices as they cope with tariffs. In this environment, consumers are likely to “trade down” on their beauty products. “For luxury beauty fans, this might mean buying more K-beauty, since they can often get similar results for less money,” says Jindal.
Still, the broader feeling of economic uncertainty could have an impact on the beauty industry as a whole, particularly if consumers feel the sting of inflation. “Brands aren’t likely to take risks on new, innovative products,” Jindal says. “They’re more likely to focus on tried-and-true products that they know will sell. So this means we’re probably going to see less creativity and innovation overall.”
This is likely to be true for K-beauty brands, as well. So while there’s a good chance we’ll still have access to affordable Korean beauty over the next few months and years, we’re probably not going to see as many exciting new innovations.
When I entered the workforce, we sent documents by fax. Everyone had a landline. If you needed to do research, you went to the library, with actual books.
Today, many of my younger colleagues would be unfamiliar with these relics of office life. Gen Z, the newest generation in the workforce, grew up with smartphones in their hands. They’re accustomed to instant information, digital communication, and a world shaped by remote work, flexibility, and purpose-driven careers. Meaningful diversity isn’t just an aspiration for them; it’s an expectation.
But Gen Z is also anxious. Studies show they’re more pessimistic about their futures than any other generation. The rise of AI only heightens that anxiety, especially as automated tools increasingly take over entry-level tasks like research, scheduling, and document review.
The onus is on leaders to bridge the gap between the strengths Gen Z brings and the foundational skills they need to build. Here are a few strategies for harnessing Gen Z’s digital expertise while supporting their long-term career growth.
Let them lead digital initiatives
At my company, we promote an automation-first mindset. Employees are encouraged to carve out time to discover new tools and integrate them into their workflows. While optimizing systems might seem like a management responsibility, there’s no reason to exclude young employees from this process. In fact, they’re ideal candidates to evaluate and experiment with emerging tech tools.
For starters, they’re digital natives. Some studies suggest Gen Z employees are up to 43% more productive when using collaborative digital tools compared to more traditional communication forms like email. Even more compelling: giving entry-level employees access to AI tools yields a higher return on investment. Research from the MIT Sloan School of Management found that new and lower-skilled workers see the biggest productivity gains from working with AI.
This approach doesn’t just build future-ready skills; it lifts performance across the organization.
Invite them to shape your brand story
Gen Z employees are natural content creators. For better or worse, many of them move through the world already thinking in grids, captions, and potential for virality. Professionally, they tend to see themselves less as loyal to a single company and more as evolving personal brands.
That instinctive sense of branding can be a major asset for companies, especially when it comes to creating engaging content. This is the generation that reimagined the résumé as a short-form video. And in many cases, that makes perfect sense. Wouldn’t a viral TikTok showcase your skills as a social media manager better than a bulleted list of job experiences?
Companies should be tapping into this native fluency and involving Gen Z employees in content creation, not as an afterthought, but as first-line collaborators. At my company, newer employees are deeply involved in the brainstorming process for social media initiatives. Even if senior team members shape the execution, some of our most successful campaigns have started with their ideas.
Make mentorship a two-way street
While AI and automation are eliminating much of the busywork from entry-level roles, they can’t replace the value of human mentorship, especially when it comes to developing soft skills. In an increasingly remote, digital-first work environment, those informal mentor relationships are at risk of fading. It’s up to leaders to ensure they don’t.
These relationships don’t have to be one-way. Peer mentorship can be a two-way dialogue that benefits both parties. More experienced employees can offer guidance on communication, conflict resolution, and navigating workplace dynamics, skills that aren’t easily taught online. For example, how do you approach a colleague about a recurring conflict without harming your work relationship? Anyone who’s worked in a team long enough knows: things run more smoothly when personal tensions are low.
At the same time, Gen Z workers can bring their older colleagues up to speed on emerging tools, platforms, and digital shortcuts. These symbiotic teaching relationships are especially important in the age of AI: executives estimate that up to 40% of their workforce will need to be reskilled over the next three years, and not just Gen Z.
In the end, fostering a culture of shared learning across roles and generations benefits the growth and future-readiness of individuals and teams alike.
To help American visitors feel more welcome in Canada at a time when relations are strained, one local tourism office is playing Canada nice.
In a 30-second spot, a tourist shown checking in at a hotel front desk tells the receptionist he doesn’t speak French and sheepishly admits, “I’m American.” For a split second, a close-up shot of the receptionist clicking a red button underneath the desk might make viewers wonder if she’s calling security, given the state of U.S.-Canada relations. But no, she’s simply opening the front desk countertop so she can go and give the man a friendly embrace. “Come hug it out in Eastern Townships” is the ad’s closing tagline.
Tourism Eastern Townships is a tourism office for a region in Quebec that’s an hour’s drive from Montreal, and the region is especially reliant on U.S. visitors since it borders Maine, New Hampshire, and Vermont. President Donald Trump’s tariffs on Canadian imports and calls for making Canada a state, though, haven’t been good for business.
Travel from the U.S. to Canada by automobile is down by 10.4%, according to data from Statistics Canada, the Canadian government’s statistical office, meaning the supply of U.S. visitors to the region that it once could count on for reliable day or weekend trips is drying up.
“Americans were actually, literally calling our hotels and attractions asking, ‘Am I still welcome? Are people going to be nice to us if we come? Are we going to be served in English?’” Tourism Eastern Townships director of visitor services Catherine Carignan-Lavasseur told the Canadian news network CTV News.
Those calls from Americans “sparked a red flag,” according to Carignan-Lavasseur, since U.S. tourists represent 6% of visitors to the region. The ad was meant to welcome them back. “The ad is a warm, humorous 30-second ad, but it’s also truly an invitation,” she said.
Trump’s antagonistic stance toward Canada has inspired a defensive “elbows up” response that’s shown up in Canadian consumer brand marketing and political messaging, but it goes against the stereotype of Canadians being unusually nice. While defensiveness and defiance might work well in politics, trade wars, and dealing with Trump, it’s bad for tourism, so Tourism Eastern Townships is trying an opposite approach.
For Americans considering a trip to Eastern Townships, the tourism office’s hugging ad arrives like a generous helping of warm Canadian maple syrup or a surprise Justin Bieber album at the end of a long week. While politics and borders divide us, a hug is universal. And by using an embrace to tell American tourists that they’re invited, the spot makes sure the message needs no translation. Visitors are welcome.
Computing revolutions are surprisingly rare. Despite the extraordinary technological progress that separates the first general-purpose digital computer, 1945’s ENIAC, from the smartphone in your pocket, both machines actually work the same fundamental way: by boiling down every task into a simple mathematical system of ones and zeros. For decades, so did every other computing device on the planet.
Then there are quantum computers, the first ground-up reimagining of how computing works since it was invented. Quantum is not about processing ones and zeros faster. Instead, it runs on qubits (more on those later) and embraces advanced physics to take computation places it’s never been before. The results could one day have a profound impact on medicine, energy, finance, and beyond: maybe, perhaps not soon, and only if the sector’s greatest expectations play out.
The field’s origins trace to a 1981 conference near Boston cohosted by MIT and IBM called Physics of Computation. There, legendary physicist Richard Feynman proposed building a computer based on the principles of quantum mechanics, pioneered in the early 20th century by Max Planck, Albert Einstein, and Niels Bohr, among others. By the century’s end, following seminal research at MIT, IBM, and elsewhere (including Caltech and Bell Labs), tech giants and startups alike joined the effort. It remains one of the industry’s longest slogs, and much of the work lies ahead.
Some of quantum’s biggest news has arrived in recent months and involves advances on multiple fronts. In December, after more than a decade of development, Google unveiled Willow, a Post-it-size quantum processor that its lead developer, Hartmut Neven, described as a major step. In February, Microsoft debuted the Majorana 1 chip, a transformative leap according to the company (though some quantum experts have questioned its claims). A week later, Amazon introduced a prototype of its own quantum processor, called Ocelot, deeming the experimental chip a breakthrough. And in March, after one of D-Wave’s machines performed a simulation of magnetic materials that would have been impossible on a supercomputer, CEO Alan Baratz declared his company had attained the sector’s holy grail. No wonder the tech industry, whose interest in quantum computing has waxed and waned over its decades-long gestation, is newly tantalized.
Bluster aside, none of these developments has led to a commercial quantum computer that performs the kinds of world-changing feats the field’s biggest advocates anticipate. More twists lie ahead before the field reaches maturity (if it ever does), let alone widespread adoption. A 2024 McKinsey study of the sector reflects this uncertainty, projecting that the quantum computing market could grow from between $8 billion and $15 billion this year to between $28 billion and $72 billion by 2035.
Whatever comes next isn’t likely to be boring. So here’s a brief overview of computing’s next big thing, which many have heard of but few of us fully understand.
1. Let’s start with why. What exactly can quantum computers do that today’s supercomputers can’t?
We’re talking about a computer that could, theoretically, unleash near-boundless opportunity in fields that benefit from complex simulations and what-if explorations. The goal of quantum computer designers isn’t just to beat supercomputers, but to enable tasks that aren’t even possible today.
For example, Google says its new Willow quantum chip took five minutes to complete a computational benchmark that the U.S. Department of Energy’s Frontier supercomputer would have needed 10 septillion years to finish. That’s a one followed by 25 zeros. By that math, even a supercomputer that was a trillion times faster than Frontier (currently the world’s second-fastest supercomputer) would require 10 trillion years to complete the benchmark.
Now, a lab-run benchmark is not the same as a world-changing feat. But Google’s test results hint at what quantum might be capable of accomplishing. Like other recent milestones, it’s tangible proof that the technology’s unprecedented potential is more than theoretical.
[Illustration: Kathleen Fu]
2. Dare I ask how quantum computers work? Do I even need to know?
As with a personal computer, you probably don’t, unless you’re planning to build one. Simply using one (which you probably won’t do anytime soon) also won’t require familiarity with the gnarly details.
The scientific underpinnings are fascinating, though. Very quickly: They’re based on aspects of quantum physics that can sound weird to us mere mortals. Entanglement, for example, describes the quantum connection between two or more particles or systems even if they’re light-years apart. And superposition states that a quantum system can exist in a state of multiple possibilities, all at the same time.
In quantum computers, the quantum bit, or qubit, takes advantage of these properties. While a conventional bit is binary, containing either a one or a zero, a qubit can have a state of one, zero, or anywhere in between. Entangled qubits work in concert, allowing algorithms to manipulate multiple qubits so they affect each other and initiate a vast, cascading series of calculations. These mind-bending capabilities are the basis of quantum computing’s power.
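For readers who want the notation behind that description, the standard way to write a single qubit’s state is as a weighted blend of zero and one, with the weights setting the odds of each measurement outcome:

```latex
% One qubit: measuring yields 0 with probability |alpha|^2, 1 with |beta|^2.
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
```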
But taming quantum physics is one of the most daunting tasks scientists have ever taken on. To maintain their quantum state, many qubits must be cooled to just a skosh above absolute zero. This requirement leads to the unique look of the machines, which resemble steampunk chandeliers. A stack of suspended discs, often made of gold-plated copper, brings the temperature progressively down, with snakes of cables shuttling data in and out of the qubits.
[Illustration: Kathleen Fu]
3. I’m already lost.
You’re not alone! One metaphor to explain quantum involves magical pennies. Imagine you’re tasked with laying out every possible heads-tails combination for the outcome of a 100-coin toss. With ordinary pennies, you’d need more than a nonillion coins; that’s a one followed by 30 zeros. Now imagine 100 magical pennies that can represent all the different combinations at the same time, covering every possible outcome. Much more efficient.
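If you want to check the coin arithmetic yourself, a few lines of Python will do it:

```python
combos = 2 ** 100       # 100 coins, each heads or tails
print(combos)           # 1267650600228229401496703205376
print(f"{combos:.2e}")  # ~1.27e+30 -- more than a nonillion (10^30)
```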
Back in the real world, many tasks are as complex as a 100-coin toss, and they’d quickly max out the ones and zeros of non-quantum machines, also known as classical computers. By going beyond the binary, qubits hold the promise of turning these tasks into magical-penny projects. Otherwise impossible computing work would become feasible.
[Illustration: Kathleen Fu]
4. That helps, but give me some examples.
Quantum computing holds particular promise in biology and materials science. Eventually, a sufficiently advanced quantum machine may be able to model molecular structures with unprecedented precision, potentially transforming everything from novel drug discovery in pharmaceuticals to the development of new kinds of batteries that would lead to cheaper EVs with greater range. Financial analysis is another prime application: Someday, a quantum computer might be better at data-intensive undertakings (portfolio optimization, securities lending, risk management, identifying arbitrage opportunities) than any classical computer.
Today’s commercially available quantum computers can’t pull off such extraordinary accomplishments. Still, some companies are already dabbling with the technology. For instance, Maryland quantum startup IonQ established a partnership with Airbus in 2022 to experiment with optimizing the process of loading cargo in a range of shapes, sizes, and weights onto aircraft with varying capacities, the kind of massive math problem that quantum is designed to solve.
5. Impressive! So why aren’t quantum computers everywhere already? Didn’t you say the idea dates to 1981?
Reliability remains an issue. Even once you’ve cooled a quantum machine to near absolute zero (no small undertaking), the slightest temperature fluctuation or electromagnetic interference can cause qubits to perform erratically.
To mitigate those errors, quantum designers deploy error-correction tech that pools physical qubits into fewer, more robust logical qubits. Increasing the number of physical qubits that can be combined into logical ones is key to the industry’s ambitions. The more reliable logical qubits there are in its processor, the stronger a quantum computer’s ability to tackle sophisticated projects.
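For the mathematically curious, one widely cited rule of thumb from the error-correction literature suggests why pooling helps (this is the surface-code scaling relation behind results like Google’s Willow; exact constants vary by scheme): as long as physical qubits err less often than a threshold rate, the logical error rate falls exponentially with the code distance d, which grows with the number of physical qubits devoted to each logical qubit.

```latex
% Logical error rate shrinks as code distance d grows,
% provided p_physical stays below the threshold.
\[
  p_{\mathrm{logical}} \approx A
  \left( \frac{p_{\mathrm{physical}}}{p_{\mathrm{threshold}}} \right)^{\lfloor (d+1)/2 \rfloor}
\]
```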
The new Majorana 1 chip, for one, holds the potential to “scale up to a million qubits in the palm of your hand,” says Krysta Svore, technical fellow for advanced quantum development at Microsoft. With just eight physical qubits, the prototype, based on a property called topological superconductivity, is a start. “You need around 100 [logical qubits] for problems in science to outperform the classical results,” Svore says. “At around 1,000 reliable logical qubits, you’ll see industrial value in chemistry.” Microsoft hopes to reach that milestone in years, not decades.
6. Who else is taking part in the race to make quantum real?
More companies than you might guess. A few are household names, such as IBM, which helped launch the whole field at that 1981 conference and deployed its first commercial quantum machine, the IBM Q System One, in 2019. Among its contributions to the field is Qiskit, an open-source software platform for writing algorithms that can be run on quantum computers, not just IBM machines but others as well.
The overwhelming majority of players, however, are small and focused on quantum. McKinsey’s 2024 report counted 261 such startups that had received a total of $6.7 billion in investment: 104 in the U.S. and Canada, and 24 in the U.K., another hub. But McKinsey says its list is not exhaustive and that it’s particularly difficult to determine how much activity is going on in China, where it identified just 10 quantum computing startups.
Some of these companies are developing their own quantum computers from the ground up, often based on novel approaches. In January, for example, Toronto-based Xanadu announced Aurora, a 12-qubit machine built around photonic, or light-based, qubits rather than superconducting ones, allowing it to run at room temperature. Others are carving off specific aspects of the technology and looking for opportunities to collaborate with hardware makers. They include U.K.-based Phasecraft, which is focused on optimizing quantum algorithms and is partnering with Google and IBM, among others.
Any quantum startup would be happy to become the Google, Amazon, or Microsoft of this new computing form, though Google, Amazon, and Microsoft share the same aim and can pour their colossal resources into achieving it. Upstarts and behemoths alike are still in the early stages of a long journey to full commercialization of the technology. Figuring out which ones might dominate a few years from now is as much of a crapshoot as identifying who the internet economy’s ultimate winners would have been in the mid-’90s.
[Illustration: Kathleen Fu]
7. Once quantum computers are humming, will classical computing go away?
Highly unlikely. From word processing to generative AI, classical computers excel at general-purpose work outside quantum computing’s domain. Few companies will buy their own quantum computers, which will remain complex and costly. Instead, businesses will access them as cloud services from providers like Amazon Web Services, Google Cloud, and Microsoft Azure, combining quantum machines with on-demand classical computers to accomplish work that neither could achieve on its own. Quantum’s ability to explore many data scenarios in parallel may make it an efficient way to train the AI algorithms that run on classical machines. “We envision a future of computing that’s heterogeneous,” says Jay Gambetta, VP of quantum at IBM.
8. Sounds like a tool that could do a lot of good, but also harm. What are the anticipated risks?
The one that concerns people the most is a doozy. Most internet data is secured via encryption techniques that date to the 1970s, and which remain impervious to decoding by classical computers. At some point, however, a quantum machine will likely be able to do so, and quickly. It’s up to the industry to start girding itself now so that businesses and individuals are protected when this eventuality comes.
Quantum-resistant, or quantum-secure, encryption (sturdy cryptographic schemes that withstand quantum-assisted cracking) is possible today: Apple has already engineered its iMessage app to be quantum-resistant. Samsung has done the same with its newest backup and syncing software. But implementing such tech globally, across all industries, will prove a huge logistical challenge. “You have to start that transition now,” says IBM’s Gambetta.
So far, the U.S. Commerce Department’s National Institute of Standards and Technology (NIST) has played a critical role in bolstering cryptographic standards for the quantum era. How well that effort will survive the Trump administration’s sweeping shrinkage of federal resources, which has already resulted in layoffs at NIST, is unclear. “I’m a little less optimistic about our ability to do anything that requires any coordination,” says Eli Levenson-Falk, an associate professor and head of a quantum lab at the University of Southern California.
[Illustration: Kathleen Fu]
9. Is there a chance that quantum computing will never amount to much?
Most experts are at least guardedly hopeful that it will live up to their expectations. But yes, the doubters exist. Some argue that scaling up the technology to a point where it’s practical could prove impossible. Others say that even if it does work, it will fall short of delivering the expected epoch-shifting advantages over classical computers.
Even before you get to outright pessimism, cold, hard reality could dash the most extravagant expectations for quantum as a business. In January, at the CES tech conference in Las Vegas, Nvidia CEO Jensen Huang said he thought “very useful” quantum machines were likely 20 years away. Huang is no hater: Nvidia is actively researching the technology, focused on how supercomputers using the company’s GPU chips might augment quantum computers, and vice versa. Nevertheless, his cautious prognostication prompted Wall Street to punish the stocks of publicly held quantum companies D-Wave, IonQ, and Rigetti Computing. If enough American investors grow antsy about when they’ll start seeing returns on quantum, the tech’s future in the U.S. could rapidly dim.
10. Twenty years! That sounds like forever.
Even the optimists can’t be all that specific about when quantum computers will be solving significant problems for commercial users. Levenson-Falk believes it could happen in the next one to 10 years, and says the uncertainty has less to do with the hardware being ready than with identifying its most promising applications.
Consider the historical arc of another field that launched at a school-sponsored gathering, this time at a Dartmouth workshop in the summer of 1956: artificial intelligence. More than 50 years elapsed before the foundational breakthroughs that paved the way for generative AI; ChatGPT didn’t come along until another decade after that. And we’re only just beginning to discover meaningful everyday applications for the technology. That quantum computing is proving to be a similarly epic undertaking shouldn’t shock anyone, even if the end of the beginning is not yet in sight.
[Illustration: Kathleen Fu]
The Strange Beauty of a Quantum Machine
Some, like this IBM Q System One, look like chandeliers. They’re colder than outer space.
[Illustration: Kathleen Fu]
1. Pulse tube coolers: As signals move toward the computer’s core, things get chillier. Here begins the first stage of cooling, via the thermodynamic heat transfer of helium gas, to 4 Kelvin.
2. Thermal shields: These layered plates act as heat blockers, isolating each successive, increasingly cold layer from external heat.
3. Coaxial cables: These lines carry microwave pulses down to the quantum chip and carry its amplified output back out. Their gold plating helps reduce energy loss.
4. Mixing chamber: The mingling of two helium isotopes (gases in the atmosphere but liquids here) provides the final blast of cooling power, bringing the temperature of the chip to 15 millikelvin.
5. Quantum amplifiers: These components amplify the weak signals carrying the qubits’ states while minimizing noise.
6. Quantum processor: About the size of a stamp, this chip houses the qubits where computing magic occurs.
If you’ve ever considered buying a used EV, now is the time.
According to a new study from the electric vehicle (EV) data analysis firm Recurrent, the used EV industry is seeing increased inventory, stable prices, and demand to rival sales of used gas cars. Recurrent’s report looks at the American used EV market from January to May 2025. The data shows that, since February, used EV inventory has been up 50% year-over-year.
The report comes in the wake of another major update for the EV market: On July 3, Congress approved new spending legislation that will end tax credits for buying new or used EVs beginning on September 30. In light of this change, experts have predicted that EV sales are likely to see a spike pre-September, followed by a decline in the months after. Here’s what to know about the current used EV market:
Used EV sales are closing in on used gas cars
Recurrent’s data showed that, at the end of May, on average, used EVs were rolling out of car lots more quickly than their gas counterparts.
According to Recurrent’s analysts, some of that demand can be attributed to the impending tax credit deadline, but there are a few other likely motivators. First, the available selection has broadened within the past few years: There are currently 70 plug-in hybrid EV and battery-powered EV models on the market. Many options, even when used, are also newer than comparable gas cars, with 72% of used EV listings dating within the past five years. To top it off, EVs are now giving gas cars significant competition in terms of affordability.
“Someone on a budget can stretch their dollar farther in the used EV market as compared with a used gas car,” Recurrent's report notes. Some 34% of used EV inventory is priced under $25,000, and 55% is under $30,000. For comparison, in the gas car market, nearly half of all used cars were priced at $20,000 or less in 2019. Today, only 11% are.
Despite Elon Musk's drag on the brand, used Teslas remain popular
One of Recurrent's more surprising findings is that, despite Tesla's endless list of brand struggles this year (see its plummeting stock prices, lagging deliveries, and constant political woes), used Teslas have not seen the crash in resale value that many experts predicted. (The sole exception to this trend is Tesla's Cybertruck model.)
While used Tesla prices did see a modest 2% dip in April, Recurrent's report demonstrates that they've broadly kept pace with the overall used EV market. More important, it adds, inventory “didn't sit idle”: April saw a 27% increase in sales volume month-over-month, and by May, used Tesla market share had risen to nearly 50% while days' supply fell to just 28 days, the lowest in the industry.
“The interesting thing about our Tesla findings is that they remain very popular options for used car shoppers,” says Andrew Garberson, Recurrent's head of growth and research. “For people who want a high-range, low-mile, affordable used EV, it's often a Tesla, based on their market share over the last 10 years. One car dealer even told us that someone looking for an affordable used EV doesn't care what Elon says or does. They just need a low-cost, reliable vehicle.”
With the tax credit deadline looming, now is the time to buy
Since 2008, purchasing a new EV has come with the incentive of a $7,500 federal tax credit, a policy that was expanded in 2022 under the Biden administration to include a potential federal credit of up to $4,000 on used EVs. With the September 30 tax credit deadline now looming on the horizon, buyers are likely to try to get in on the deal before it's gone.
While Garberson says it's a little too early for Recurrent to see a spike in used EV sales resulting from the new spending bill, the team has seen a considerable increase in interest from clients. As of July 9, weekly website traffic to Recurrent's used EV resources had doubled week-over-week.
“The main takeaway is that now is the time for people considering an affordable used EV,” Garberson says. “The tax credits are helping to anchor prices to the $25,000 threshold so newer models with modern range and technology can be affordable for a lot of people. Eligible buyers can also apply the $4,000 rebate to the purchase price, even using it as a down payment. It's a deal that can't be beat.”
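To make that math concrete, here is a quick back-of-the-envelope sketch in Python. The 30%-of-sale-price formula and the $25,000 eligibility cap reflect how the federal used clean vehicle credit is commonly described; treat the figures as illustrative assumptions, not tax advice.

```python
# Rough math for the federal used-EV credit. Assumptions beyond the
# article: the credit equals 30% of the sale price, capped at $4,000,
# and only vehicles priced at $25,000 or less qualify. Illustrative
# only; real eligibility also depends on income and vehicle-age rules.

def used_ev_credit(sale_price: float) -> float:
    """Estimate the federal credit for a qualifying used EV."""
    if sale_price > 25_000:  # over the price cap: no credit at all
        return 0.0
    return min(0.30 * sale_price, 4_000.0)

for price in (12_000, 20_000, 25_000, 26_000):
    credit = used_ev_credit(price)
    print(f"${price:,} sticker -> ${credit:,.0f} credit, "
          f"${price - credit:,.0f} effective cost")
```

Under those assumptions, the cap bites quickly: any sale price above roughly $13,300 already earns the full $4,000, which is one way the credit anchors shoppers to the sub-$25,000 tier the report highlights.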
Can the used EV market sustain this momentum?
So far, most experts have predicted that the end of the tax credit will be detrimental for the American EV market.
Dan Levy, a Barclays auto analyst, explained in an interview with Reuters: “We believe the bill reiterates the slowdown ahead for EV penetration in the U.S., with both the carrot (i.e., tax credits/incentives) and the stick (i.e., emissions regulations) softened.”
Garberson says that, while lease terms indicate that an influx of used EVs is set to hit the market in 2026, it's difficult to predict what that might look like without the credit.
“The question on everyone's mind is what happens between September 30 and the end of the year,” Garberson says. “I think we'll see that dealerships will keep inventory low while the market rebalances, which would keep sticker prices relatively stable. It'll be interesting to see how states act to backfill federal incentives. A number of states either currently offer or have proposed rebates to help their residents. But all of those factors make the crystal ball difficult to read.”
Of all the social media platforms chasing users and flooding the internet with content, Nextdoor has always been a bit of an oddball. Rather than offering Instagram-style influence or Twitter-style followers or even Facebook-style endless engagement, Nextdoor has been focused on the earnest goal of connecting neighbors with each other.
Since its founding in 2011, Nextdoor's purpose has been to serve as the digital window through which people can better interact with their own neighbors and neighborhoods, online and in real life.
Today, Nextdoor is relaunching and unveiling a comprehensive redesign that positions it to actually make that possible.
The new Nextdoor is moving away from its message-board past toward a more informative offering of geographically relevant real-time alerts, local news from vetted publishers, and a more accessible pool of neighborhood knowledge undergirded by artificial intelligence. The type of stuff you may have seen on Nextdoor over the past 14 years (notices about lost or found pets, questions about whether anyone else's power is out, that cranky neighbor who posts a bit too often) will largely be replaced by a feed of information that prioritizes relevance and utility.
It’s the first major update to the platform since its founding. Internally, Nextdoor executives are calling this a relaunch of both the product and the company as a whole.
[Image: courtesy Nextdoor]
Founder returns
The relaunch of Nextdoor has been in the works since last fall, a few months after founder Nirav Tolia returned to the role of CEO after five years away from the day-to-day operations of the company. Tolia tells Fast Company that he came back to Nextdoor with a renewed enthusiasm for its mission of connecting neighbors, but also a clearer view of how the platform had struggled to meet its main goal.
“The potential of the Nextdoor idea had not been realized by the existing product,” Tolia says. “And the reason for that is that it’s a very hard problem. Local word of mouth, which is really what Nextdoor is all about, is one of the last things remaining to be digitized.”
That challenge hasn’t stopped Nextdoor from continuing to grow, however. According to the company’s end-of-2024 report, weekly active users of the platform grew 10% year over year, to 46.1 million in the fourth quarter, and the company made $247.3 million in revenue in 2024, up 13% from 2023. Despite these positive numbers, Tolia sees room for improvement. Others see the relaunch as a defining moment for the company . . . and a gamble.
“Our biggest challenge with the existing Nextdoor is that the content is not high-quality enough, it’s not timely enough, and it’s not comprehensive enough,” Tolia says. That’s led the company to move away from solely user-generated content to more of a user-augmented content approach, supported by geotargeted news and alert feeds from credible outside sources. For the relaunch, Nextdoor has partnered with 3,500 news outlets to provide feeds of local news, and the platform will also automatically load real-time alerts from more than 5,000 local public safety, emergency, and utility agencies.
[Image: courtesy Nextdoor]
It’s also integrating AI into one of the more quintessential parts of the Nextdoor experience, which is neighbors seeking or providing information about local services, businesses, and events. The new AI-backed search feature draws from 14 years’ worth of posts to answer user questions on things like restaurant recommendations, contractors, and family-friendly activities. Rather than just asking a question in the form of a post and hoping for a useful response, users can ask their question in the search field and get instant results.
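Nextdoor hasn't published how this search works under the hood, but the behavior it describes, mining years of posts to answer a fresh question, matches a standard embedding-and-retrieval pattern. Here is a minimal sketch in Python; the sample posts, the embed() stand-in, and every name in it are hypothetical illustrations, not Nextdoor's actual code.

```python
# A toy version of question-answering search over archived posts:
# embed each post once, embed the incoming question, and return the
# closest matches. Everything here is hypothetical.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in for a trained sentence-embedding model. A real system
    would map similar meanings to nearby vectors; this hash-seeded
    version exists only to make the sketch runnable."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.normal(size=128)
    return vec / np.linalg.norm(vec)

posts = [
    "Looking for a reliable plumber near Maple Street.",
    "The taco truck on 5th is great for families.",
    "Heads up: power is out on the north side of the park.",
]
post_vecs = np.stack([embed(p) for p in posts])  # indexed once, up front

def search(question: str, k: int = 2) -> list[str]:
    """Return the k archived posts most similar to the question."""
    scores = post_vecs @ embed(question)  # cosine similarity of unit vectors
    return [posts[i] for i in np.argsort(scores)[::-1][:k]]

print(search("Any family-friendly food recommendations?"))
```

The practical point is the speed Tolia describes: instead of posting a question and waiting for a neighbor to reply, a lookup against already-indexed posts returns instantly.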
Speeding up this feedback loop is a key part of making Nextdoor more useful to people, Tolia says. Combining neighborhood knowledge and word-of-mouth recommendations with vetted information about news and local events could make Nextdoor into what Tolia calls a “first-screen app.”
New user experience, new user behavior
A change this big requires users to shift how they interact with Nextdoor. Before the relaunch project got underway, Tolia hired Georg Petschnigg as Nextdoor’s chief design officer. A seasoned user experience designer, Petschnigg is best known for his time as head of product design at The New York Times, where he led a comprehensive redesign of the newspaper’s app. For Nextdoor, he’s bringing a news-centric sensibility and a focus on getting people the information they want as quickly as possible.
[Image: courtesy Nextdoor]
Given Nextdoor’s focus on neighborhoods, the main entry point for content on the platform is the user’s location, which gets prominent placement in the top-left corner of the new Nextdoor app. “The most important thing we want to signal is that this is about your neighborhood,” Petschnigg says.
[Image: courtesy Nextdoor]
A tap in that upper-left corner takes users to a neighborhood map, which displays real-time alerts in the area, including extreme weather, fires, power outages, and police activity. If something important is happening that directly affects a user or their location, like a public safety emergency or a power outage, that information will be displayed automatically on a user’s home screen.
[Image: courtesy Nextdoor]
Other times, the home screen will be a mix of local news and posts from neighbors, all geotargeted for specific relevance in a given location. “Up until now, neighbors had to report on what’s happening, and they will do that,” Petschnigg says. “But being able to bring in these verified information sources really supercharges the dynamic on the platform.”
To spur active engagement, users can comment on the news items in this main feed. To try to avoid the cesspool that user comments sections can become, Nextdoor is using AI to create prompt questions that direct users to add their own reactions or information about the news shown in the feed. “We want to encourage the habit to discuss the news or advance the news,” Petschnigg says.
It’s unusual for a design to put news and user commentary all on the same level, Petschnigg says, but he’s hoping the focus on news stories that are relevant to a given neighborhood will become a starting point for deeper information sharing. “We’re hearing people saying they are interested in the news, but also that the conversation around the news is important,” he adds. “Framing news as information that exists to serve a community is really, really important, and we want to do that right off the bat.”
[Image: courtesy Nextdoor]
The other parts of the relaunched Nextdoor platform can be accessed by a simple floating navigation area at the bottom of the screen, an approach similar to the one underpinning Apple’s recently announced Liquid Glass user interface. Petschnigg says Nextdoor’s UX was in the works long before Apple made this reveal, but the similarities are validating. “We started understanding that if you want to have a user interface that lets the content breathe, you need to pare back the footprint of your navigational elements,” he says.
[Image: courtesy Nextdoor]
The floating navigation bar makes the home screen and its news feed, the search function, and local recommendations (or “faves”) the three most important parts of the new Nextdoor. This user interface, and the platform’s emphasis on bringing in reliable and vetted information sources, aims to transform Nextdoor from a free-for-all message board into something more deliberately informative.
Focusing on the local (through alerts, geotargeted news, and an AI-assisted knowledge bank) is the way Tolia thinks Nextdoor can stand out from other social media platforms. “There are only about a dozen apps that we rely on every single day. The fact that none of those dozen apps is related to our local life is just kind of mind-boggling to me,” Tolia says. “Given that 30% of Americans are now working from home, where we live is more important than it’s ever been. So I think our opportunity is bigger than it’s ever been as well.”
When I was laid off from my position at Forbes, I felt completely blindsided. I had spent years associating myself with my title of marketing director at such a well-known brand, and then it was gone just like that. Like many people, I found myself staring at my laptop and outdated résumé, wondering: What now?
The job market was (and still is) challenging. I knew simply mass sending résumés out to jobs was not going to work this time. Instead, I needed to show up meaningfully, build real connections, and focus on standing out in more authentic ways online.
As someone who spent years helping brands tell their stories, I suddenly had to figure out my own, and fast. So, I decided to turn to LinkedIn to build my personal brand.
I started by fully updating and optimizing my profile. Then I identified my core content pillars: what I wanted to be known for and what felt authentic to me. I committed to posting two to three times a week, sharing practical marketing tips, personal stories, and relatable career moments. I also made a point to comment consistently and thoughtfully on other people’s content to build visibility and connections.
When I started in July 2024, I had 2,400 followers. Today, I've grown that community to over 23,000. Growing my following has brought me new clients, brand partnerships, speaking opportunities across the country, and job opportunities.
But to be clear, I don't expect everyone to go out and try to become a LinkedIn influencer. (Unless that's what you want to do!) What I do believe is that an unexpected career change, like a layoff, can be the perfect moment to give your LinkedIn a serious makeover and set yourself up for your next opportunity.
Here's how you can use this time to revamp your LinkedIn profile, tell your story, and stand out to future employers or clients.
Audit your profile with fresh eyes
Your LinkedIn profile isn't just a digital résumé. It's your personal brand headquarters, your landing page, and your first impression. Even if you aren't looking to become a content creator, a strong profile builds credibility and opens doors.
Start by looking at your profile like a recruiter or potential hiring manager would. Is it clear what you do and what you're great at? Or does it read like a list of past job titles with no story behind it?
Headline: This isn't just your current or most recent title. It's your hook. Use it to highlight what you do best and what you want to be known for, e.g., “Financial services leader turning data into dollars” or “I help nonprofit organizations increase community impact through organic social media strategies.”
Cover photo: This is prime real estate you can use to showcase your work, such as awards, recognitions, featured brands you’ve worked with, and more. You can say a lot with the space available right at the top of your profile.
About section: This is where you can truly tell your story and showcase who you are beyond your résumé. Instead of simply listing your skills or turning it into a “word salad” of buzzwords, focus on crafting a clear, compelling narrative that connects with your audience. Share what drives you, what you’re passionate about, and the journey that brought you to where you are today. Highlight your unique value, the problems you love to solve, and what makes you different. Think of it as your personal pitch and cover letter in one: a space to build trust and make people want to learn more, work with you, or support your next move.
Featured section: Use this section to showcase your best work, media mentions, big wins, or thought leadership pieces. If you don't have those yet, consider adding a link to a personal website, a résumé, or a featured post that highlights your perspective or expertise: anything that backs up the pitch in your About section.
Experience: Go beyond listing responsibilities. Highlight measurable results, key projects, and how you made a difference. This isn’t about bullet points or just listing a job description. Tell the reader what you did there and the results you achieved. Use data, metrics, and concrete examples to showcase your impact.
Showing up on the platform
There are different ways you can show up on LinkedIn, depending on your comfort level and goals. And there's an untapped opportunity here: According to LinkedIn data, only 12% of users post consistently, which means you have a chance to stand out more easily.
If you want to start posting content, you should first decide what you want to talk about, which means defining your content pillars. Think about what you want to be known for, what your audience engages with, and what you can talk about consistently without burning out. For me, my pillars are marketing expertise, personal and career stories, and light, relatable corporate humor. Defining these helped me show up with intention and build trust.
You don't have to post every day or aim for viral content. Start small:
Share a lesson youve learned recently.
Talk about a challenge you overcame.
Offer insights in your area of expertise.
The more you show up, the more you'll start to feel seen, and the more likely new opportunities are to find you.
If you don't feel comfortable writing and sharing content, that's absolutely fine. There are other ways to show up on the platform and engage with the community.
Reposting content from people you follow and ideas you support is a great way to share content without having to create it. Adding your thoughts to the post can be an easy way to add to the conversation.
Commenting is also a simple way to engage on LinkedIn. It's like virtual networking: It helps you get noticed, build relationships, and stay top of mind. Comments often lead to profile views, connection requests, and even opportunities like collaborations or job leads. Aim to leave thoughtful, genuine comments that add value, rather than quick reactions. Support others consistently; it's one of the simplest but most effective ways to grow your presence and strengthen your network.
A layoff might feel like an ending, but it can be the push you need to finally focus on yourself and your next chapter.
You might not grow your following from 2,400 to 23,000, and you don't have to. But you can turn this moment into a powerful chance to show the world who you are and what you're capable of.