When I was a kid, my favorite place in the world was hunched over a sewing machine. I'd cut up old jeans, hand-stitch fabric scraps into new outfits, and dream of someday seeing my clothes walk a runway. My notebooks were full of fashion drawings. Somewhere in my teens, that dream slipped quietly into the background. Life pulled me in a different direction.

But this year, thanks to AI, I finally staged my first runway show at New York Fashion Week. Okay, not on the literal Fashion Week runways in Manhattan, but on social media, where people are scrolling for Fashion Week content. And the wild part? I pulled it together in one Friday night using my own AI-powered fashion brand, yanabanana.

The tech stack behind the catwalk

The show was called The Stockholm Archipelago Collection, inspired by a trip I took to Yasuragi, a Japanese-style spa perched on the water outside Stockholm. Architectural shapes, blue kimonos, and tall pines by the water were my mental mood board as I designed the collection. Here's how I translated inspiration into a digital runway:

Sketch to photo: I started with a rough sketch of each look. Using Google's Nano Banana image generation model, I transformed my doodles into photos. Sometimes I generated two photos (a start and an end scene) to create a more interesting runway moment.

Models on the runway: Through prompt engineering, I iterated until all my looks walked the same runway, which I had decorated with my photos of the water view from Yasuragi.

Static to cinematic: I turned the images into short clips with Midjourney's video model. It worked, but I'll be experimenting with different video models next season. Runway fluidity is tricky!

Custom soundtrack: Every show needs a vibe, so I used Suno to generate an original Scandinavian-inspired track to set the pace.

Cut & polish: Finally, I stitched it all together in iMovie, as old-school as it gets in the age of AI.

The result? A minute-long AI-powered runway film that could almost pass for an indie cut of a Fashion Week show.

AI is the new sewing machine

What I love about this process is that AI collapsed the barrier between imagination and execution. Ten-year-old me could only dream of sourcing fabrics, hiring models, and booking a venue. Today all I need is a sketch, a stack of AI models to create virtual human models, and a little curiosity. And yet, the story didn't stop at the digital runway.

From sketch to closet

At one point, I even thought about building a platform where fashion designers could sketch with AI and then manufacture their garments. That idea simmered until I stumbled on Flair, an early-stage startup already doing exactly that. I joined one of their sessions with a roomful of fashion designers during San Francisco Design Week this spring. The format was like an AI version of Project Runway: everyone created designs, and whichever one got the most votes on the platform over the next week would be brought to life. Mine won.

I sent in my measurements, and last week a package arrived. Inside was a dress that had started as a doodle in my notebook, passed through Flair's AI workflow, and emerged as a real garment stitched together in the physical world. Slipping it on for the first time was magic. It was the same rush I felt as a kid cutting up old jeans, except this time the runway wasn't just in my imagination. It was hanging in my closet.

The bigger picture

For me, yanabanana isn't about building a traditional fashion house. It's about asking: What does a fashion brand born in the age of AI even look like? Maybe it doesn't need to produce clothes at all. Maybe its runways live on Instagram, soundtracked by generative beats, designed with prompts instead of pins. And maybe, sometimes, those designs make the leap from pixels to fabric. And maybe that's exactly what makes it fashion-forward.

Yana Welinder is Head of AI at Amplitude. She was CEO and founder of Kraftful (recently acquired by Amplitude).
Today, retail giant Target Corporation (NYSE: TGT) reported its third-quarter fiscal 2025 earnings. Unfortunately for the company and its investors, the results were a continuation of what Target has been seeing for years now: declining sales. Here's what you need to know about Target's Q3 and the impact the earnings are having on the company's stock price today.

Target's Q3 2025 at a glance

Here's what the big-box retailer reported for its Q3 2025:

Net sales: $25.3 billion (down 1.4% from the same period in 2024)
Adjusted earnings per share (EPS): $1.78 (down from $1.85 in the same period in 2024)
Operating income: $948 million (down 18.9%)
Net earnings: $689 million (down 19.3%)

To put those first two all-important metrics into perspective, net sales came in below what analysts were expecting, but the company's adjusted earnings per share came in slightly above. As CNBC notes, LSEG analysts expected Target to post revenue of $25.32 billion and an adjusted EPS of $1.72. One bright spot in Target's Q3 results was digital comparable sales, which increased 2.4%.

Announcing the company's Q3 2025 earnings, Target's incoming CEO, Michael Fiddelke, who takes the helm in February, said, "Thanks to the incredible work and dedication of the Target team, our third quarter performance was in line with our expectations, despite multiple challenges continuing to face our business."

Target's sales woes continue

What are those "multiple challenges"? Most broadly, Target has seen stagnant or declining quarterly sales for years now. Some of those sales woes are driven by factors not unique to Target. For several years, retailers of all stripes have seen customers grow more cautious about how and where they spend their discretionary dollars, a caution largely spurred by inflationary pressures and rising cost-of-living expenses. The company, like most retailers, is also facing significant competition from other big-box giants, including Walmart, as well as from online retailers like Amazon and, in more recent years, Temu and Shein.

However, several factors unique to Target have also weighed on its sales for quite some time. As Fast Company reported in May, customers had been complaining about messier layouts, long lines, and understaffed stores, which led to a notable decline in customer service in many customers' eyes. Finally, earlier this year, Target rolled back some of its DEI initiatives after Trump came to power. That prompted backlash and a boycott from many Target customers, and Target has previously said the backlash impacted sales.

All eyes on the holiday quarter and TGT stock

Despite the sales decline in Q3, Target maintained its outlook for its current Q4, which includes the all-important holiday period. Yet that's not exactly a good thing. Target had previously forecast a low single-digit sales decline for Q4, and it has now confirmed that it still expects that decline (though at least, the company might argue, the decline isn't forecast to get any worse).

What Target did adjust was its full fiscal 2025 forecast. Target had previously said it expected adjusted earnings per share for the year to come in between $7 and $9. Now the company says it expects adjusted EPS for fiscal 2025 to be between $7 and $8.

Target's stock reacted about as well as you would expect. As of this writing, TGT shares are trading down about 2.97% to $85.90 per share in premarket trading. The company's stock price has had a rough 2025. Since the year began, TGT shares have declined more than 34% as of yesterday's closing price of $88.53. Looking back over the past 12 months, things are even worse: over that period, TGT shares have declined more than 43% as of yesterday's close.
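For readers who want to sanity-check the premarket figure, here is a minimal arithmetic sketch using only the two prices cited above (the variable names are purely illustrative, not from any trading API):

```python
# Back-of-the-envelope check of the premarket move cited above (illustrative only;
# the prices are the ones reported in the article, not pulled from a live feed).

yesterday_close = 88.53   # TGT closing price the day before the report
premarket_price = 85.90   # premarket price cited in the article

# Percent change from yesterday's close to the premarket quote
premarket_move = (premarket_price - yesterday_close) / yesterday_close * 100
print(f"Premarket move: {premarket_move:.2f}%")  # about -2.97%, matching the article
```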
Across nearly four decades as a teacher, principal, superintendent, funder, and now leader of a large education nonprofit organization, the experience that most shaped my view of learning wasn't a grand reform or a shiny new program. It was a Friday physics lab in Brooklyn. My students predicted a graph that couldn't exist: a vertical line for velocity and time. What followed was confusion, debate, trial, and error. And then discovery: velocity requires both displacement and time. That brief struggle taught me, the teacher at the time, more about how learning really happens than any policy memo ever has.

That moment endures because it represents what school should unlock every day: inquiry, persistence, and the joy of figuring something out yourself. Too often, students still move through school executing a recipe of steps without understanding ideas. In math, science, history, and English language arts, they follow the recipe and miss the point. That approach may be tidy, but it's not transformative. It shortchanges imagination, curiosity, and the a-ha! moments that make knowledge durable.

HOW TO EMPOWER STUDENTS

I believe that learning is only powerful if it combines agency, purpose, curiosity, and connection to empower students for the future. What does that mean? It means that learners should pursue knowledge through action, through choice, and through voice. They should have opportunities to make authentic and meaningful contributions. They should explore new ideas and experiences to better understand their world. And they should make connections between ideas, experiences, and people. When students are allowed to experiment, to wrestle productively and recover from mistakes, they don't just master content; they build the habits of mind that matter in life and work.

TECHNOLOGY'S ROLE

Emerging technologies hold enormous potential to make these kinds of experiences more common. They can curate simulations, prompt inquiry, and scaffold experimentation, and they can create new entry points for students to explore, revise, and connect their ideas. The little moments of technology matter, too, like a 90-second BrainPOP animation that unlocks a tough concept, an interactive that prompts a classroom debate, or a quick, purposeful game that turns practice into understanding. These are the sparks that turn a lesson into learning. Technology is not a recipe to follow; it's a set of instruments to conduct.

If we want learners who can think with and about AI, then classrooms must invite students to do what my Brooklyn high school physics class did: predict, test, argue from evidence, and revise. That last step can reveal the evolution in a student's thinking and how they move through conceptual phases of understanding. It requires commitments like access and teacher expertise, as well as ensuring quality over quantity.

I'm heartened to see some schools rising to meet this challenge, like Ypsilanti Community High School in Michigan, with its new AI Lab. The first-of-its-kind collaboration between the school district, leading tech companies, and nonprofits equips students with advanced tools for AI-powered learning, including processors designed to handle complex AI computations, audio-visual equipment, and 3D modeling software. The lab doesn't simply build AI literacy; it allows students to explore ideas that matter to them using advanced technology. Students gain hands-on experience in emerging fields while also building creativity and a sense of innovation. The lab challenges them to think critically, pushes them to be creative, and strengthens their real-world problem-solving skills. These are the kinds of experiences we need to provide to prepare students for an AI-driven world.

LET STUDENTS LEARN THROUGH DOING

As we increasingly integrate AI into classrooms, students must be allowed to experiment and explore with it, to argue from evidence, to fail, to struggle productively. When we get this right, we hear the right kind of noise: classrooms buzzing with questions, debates breaking out, and students making lifelong connections.

I still remember that Brooklyn lab as if it were yesterday. Not because of the graph, but because of what it revealed: When students are trusted to do the intellectual heavy lifting, they surprise us, and themselves. Our job is to design schools where discovery is not an accident, but the plan.

Jean-Claude Brizard is president and CEO of Digital Promise.