
2026-02-13 10:00:00| Fast Company

Marks & Spencer is one of the latest U.K. high-street brands to launch a skiwear collection. Even supermarket Lidl is in on the action, with items in its ski range priced at less than 5 pounds (roughly $6.75). This follows earlier moves by fast-fashion retailers such as Topshop, which launched SNO in the mid-2010s, and Zara's imaginatively titled Zara Ski collection, which launched in 2023.

Fast-fashion brand PrettyLittleThing's Après Ski edit (a collection of clothes chosen for a specific theme) tells potential shoppers that going skiing is not necessarily essential, which is good, because many of the products in the collection are listed as athleisure, not sportswear.

It's not just the high street. Kim Kardashian's shapewear brand Skims has recently collaborated with The North Face and has dressed Team USA for the 2026 Winter Olympics, though these garments are strictly designed to serve the athletes during downtime, not for the piste.

Alongside dedicated skiwear lines, the après-ski aesthetic has become a recurring seasonal trend over recent years, expanding well beyond the slopes. You may have noticed the slew of ski-themed sweatshirts across the market. One of these, an Abercrombie & Fitch sweatshirt, went viral in January after a buyer noticed that the depicted resort was actually Val Thorens, France, not Aspen, Colorado, as the text printed on the garment claimed.

It is not only the quality of ski-themed fashion products that is a cause for concern, but also that of the garments designed for the slopes. Many of these high-street collections have received criticism from consumers, with some claiming that the garments are not fit for purpose. Meanwhile, many influencers have taken to social media to warn their followers to avoid skiing in garments from fast-fashion brands. Such were the complaints that Zara Ski reportedly renamed its products "water resistant" instead of "waterproof."

These collections respond, in part, to a genuine need for women's sportswear that is practical, fashionable, and, most critically, affordable. Ski and performance wear in general is costly, so collections that are both fashionable and relatively low-cost make for an attractive prospect. And yet, if these garments are so poorly suited to skiing, then what are they for?

The visual allure of skiing

Despite sports playing a key role in challenging gender ideology and perceptions of female physicality, the perceived importance of femininity and of how women look while doing sports has lingered. Images of sportswomen frequently fixate on gender difference, and femininity is foregrounded over athleticism.

Here, the glamorous image of skiing has much to account for. Glamour relies on distance and difference to conjure a feeling of longing. For many, the novelty of eating fondue at 3,000 feet is out of reach, as is the ever-increasing price of a lift pass.

Throughout the 20th century, the glamour of skiing has been defined by women's fashion. In the 1920s, Vogue magazine featured illustrations of elongated skiing women on its covers. Designer Pucci's aerodynamic one-piece ski suit premiered in Harper's Bazaar magazine in 1947, while Moncler's ski anoraks, photographed on Jackie Kennedy in 1966, gave birth to a vision of American ski cool. Changing ski fashions were recorded in photographer Slim Aarons's resort photography, capturing the leisure class on and off the piste between the 1950s and 1980s.
[Image: Vogue Archive]

Women's fashionable skiwear has taken many forms since the activity first became popular in the 1920s. It was during this decade that skiing became a marker of affluence. Leather, gaberdine, fur, and wool were popular materials in early women's skiwear and were selected for their natural properties: water-repellence, insulation, and breathability. By the mid-century, women's skiwear became more focused on silhouette, and excess fabric was considered unfeminine. Equally, skiwear gradually became more colourful, and in the fashion press women were even encouraged to match their lipstick to their ski ensemble. By the 1980s, skiwear aligned with the fashionable wedge silhouette, causing the shoulders of ski jackets to widen and salopettes (ski trousers with shoulder braces) to draw even tighter.

These historic developments parallel today's aesthetic ski trend, where fashion and image arguably come before function. For example, PrettyLittleThing's models are photographed on fake slopes, holding vintage skis. The glamorous image of the skiing woman lies not only in the clothing but in her stasis. The suggestion is that ski culture does not necessarily require skiing at all: It may simply involve occupying the most visible terrace, Aperol in hand.

No wonder, then, that so many fast-fashion ski lines for women are deeply impractical: They appear designed less for physical exertion than for visual consumption. They sell women on the alluring glamour of skiing, while leaving them out in the cold.

There is an additional irony here: Climate change means that skiing is becoming increasingly exclusive. Lower-level resorts are closing as the snow line moves up, meaning fewer options and increased demand. In this sense, the image of skiing looks set to become even more glamorous via increasing inaccessibility and therefore distance. Fast fashion has a negative impact on the environment, and the ski aesthetic risks damaging the very thing it claims to celebrate.

This article features references to books that have been included for editorial reasons, and may contain links to bookshop.org. If you click on one of the links and go on to buy something from bookshop.org, The Conversation UK may earn a commission.

Tamsin Johnson is a PhD candidate in visual cultures at Nottingham Trent University. This article is republished from The Conversation under a Creative Commons license. Read the original article.


Category: E-Commerce

 

LATEST NEWS

2026-02-13 09:30:00| Fast Company

This story was originally published by Grist. Sign up for Grist's weekly newsletter here.

The conversation around energy use in the United States has become . . . electric. Everyone from President Donald Trump to the cohosts of the Today show has been talking about the surging demand for, and rising costs of, electrons. Many people worry that utilities won't be able to produce enough power. But a report released today argues that the better question is: Can we use what utilities already produce more efficiently in order to absorb the coming surge?

"A lot of folks have been looking at this from the perspective of, 'Do we need more supply-side resources and gas plants?'" said Mike Specian, utilities manager with the nonprofit American Council for an Energy-Efficient Economy, or ACEEE, who wrote the report. "We found that there is a lack of discussion of demand-side measures."

When Specian dug into the data, he discovered that implementing energy-efficiency measures and shifting electricity usage to lower-demand times are two of the fastest and cheapest ways of meeting the growing thirst for electricity. These moves could help meet much, if not all, of the nation's projected load growth. Moreover, they would cost only half, or less, of what building out new infrastructure would, while avoiding the emissions those operations would bring. But Specian also found that governments could be doing more to incentivize utilities to take advantage of these demand-side gains.

"Energy efficiency and flexibility are still a massive untapped resource in the U.S.," he said. "As we get to higher levels of electrification, it's going to become increasingly important."

The report estimated that by 2040, utility-driven efficiency programs could cut usage by about 8 percent, or around 70 gigawatts, and that making those cuts currently costs around $20.70 per megawatt-hour saved. The cheapest gas-fired power plants now start at about $45 per megawatt-hour generated. While the cost of load shifting is harder to pin down, the report estimates that moving electricity use away from peak hours (often through time-of-use pricing, smart devices, or utility controls) to times when the grid is less strained and power is cheaper could save another 60 to 200 gigawatts of power by 2035. That alone would far outweigh even the most aggressive near-term projections for data center capacity growth.

Vijay Modi, director of the Quadracci Sustainable Engineering Laboratory at Columbia University, agrees that energy efficiency is critical but isn't sure how many easy savings are left to be had. He also believes that governments at every level, rather than utilities, are best suited to incentivize that work. He sees greater potential in balancing loads to ease peak demand.

"This is a big concern," he said, explaining that when peak load goes up, it could require upgrading substations, transformers, power lines, and a host of other distribution equipment. That raises costs and rates. Utilities, he added, are well positioned to solve this because they have the data needed to effectively shift usage, and they are already taking steps in that direction by investing in load management software, installing battery storage, and generating electricity closer to end users with things like small-scale renewable energy.

"It defers some of the heavy investment," said Modi. "In turn, the customer also benefits."

Specian says that one reason utilities tend to focus on the supply side of the equation is that they can often make more money that way.
Building infrastructure is considered a capital investment, and utilities can pass that cost on to customers, plus an additional rate of return, or premium, which is typically around 10 percent. Energy-efficiency programs, however, are generally considered an operating expense, which isn't eligible for a rate of return. This setup, he said, motivates utilities to build new infrastructure rather than conserve energy, even if the latter presents a more affordable option for ratepayers.

"Our incentives aren't properly lined up," said Specian. State legislators and regulators can address this, he said, by implementing energy-efficiency resource standards or performance-based regulation. Decoupling, which separates a company's revenue from the amount of electricity it sells, is another tactic that many states are adopting.

Joe Daniel, who runs the carbon-free electricity team at the nonprofit Rocky Mountain Institute, has also been watching a model known as fuel cost sharing, which allows utilities and ratepayers to share any savings or added costs rather than passing them on entirely to customers. "It's a policy that seems to make logical sense," he said. A handful of states across the political spectrum have adopted the approach, and of the people he's spoken with or heard from, Daniel said, "every consumer advocate, every state public commissioner, likes it."

The Edison Electric Institute, which represents all of the country's investor-owned electric companies, told Grist that regardless of regulation, utilities are making progress in these areas. "EEI's member companies operate robust energy-efficiency programs that save enough electricity each year to power nearly 30 million U.S. homes," the organization said in a statement. "Electric companies continue to work closely with customers who are interested in demand response, energy efficiency, and other load-flexibility programs that can reduce their energy use and costs."

Because infrastructure changes happen on long timelines, it's critical to keep pushing on these levers now, said Ben Finkelor, executive director of the Energy and Efficiency Institute at the University of California, Davis. "The planning is 10 years out," he said, adding that preparing today could save billions in the future. "Perhaps we can avoid building those baseload assets."

Specian hopes his report reaches legislatures, regulators, and consumers alike. Whoever reads it, he says, the message should be clear.

By Tik Root

This article originally appeared in Grist. Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org.
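To make the load-shifting idea concrete, here is a minimal illustrative sketch in Python. It is not drawn from the ACEEE report: the tariff, peak window, and household usage profile are all invented for illustration. It simply shows how moving a slice of flexible evening usage into cheap overnight hours under a time-of-use price lowers both the household's peak draw and its daily bill.

```python
# Minimal illustrative sketch (hypothetical numbers, not from the ACEEE report):
# shifting flexible load out of peak hours under time-of-use pricing.

HOURS = range(24)
PEAK_HOURS = range(17, 21)  # assumed 5-9 p.m. peak window
# Assumed time-of-use tariff in $/kWh: expensive on-peak, cheap off-peak.
PRICE = [0.32 if h in PEAK_HOURS else 0.12 for h in HOURS]

# Made-up household usage profile in kWh, with an evening peak.
usage = [2.5 if h in PEAK_HOURS else 0.8 for h in HOURS]

def daily_bill(profile):
    """Cost of a 24-hour usage profile under the time-of-use tariff."""
    return sum(kwh * price for kwh, price in zip(profile, PRICE))

def shift_flexible_load(profile, kwh_per_peak_hour=1.0, overnight=range(0, 6)):
    """Move a fixed slice of each peak hour's usage into overnight hours."""
    shifted = list(profile)
    moved = 0.0
    for h in PEAK_HOURS:
        slice_kwh = min(kwh_per_peak_hour, shifted[h])
        shifted[h] -= slice_kwh
        moved += slice_kwh
    for h in overnight:                      # spread the moved energy evenly
        shifted[h] += moved / len(overnight)
    return shifted

shifted = shift_flexible_load(usage)
print(f"peak-hour draw: {max(usage):.2f} kWh -> {max(shifted):.2f} kWh")
print(f"daily cost: ${daily_bill(usage):.2f} -> ${daily_bill(shifted):.2f}")
print(f"total energy: {sum(usage):.1f} kWh -> {sum(shifted):.1f} kWh (unchanged)")
```

In this toy example the total energy consumed does not change; only its timing moves, which is exactly the property that lets load shifting defer peak-driven upgrades to substations, transformers, and lines.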


Category: E-Commerce

 

2026-02-13 09:00:00| Fast Company

For the past two years, artificial intelligence has felt oddly flat. Large language models spread at unprecedented speed, but they also erased much of the competitive gradient. Everyone has access to the same models, the same interfaces, and, increasingly, the same answers. What initially looked like a technological revolution quickly started to resemble a utility: powerful, impressive, and largely interchangeable, a dynamic already visible in the rapid commoditization of foundation models across providers like OpenAI, Google, Anthropic, and Meta.

That flattening is not an accident. LLMs are extraordinarily good at one thing, learning from text, but structurally incapable of another: understanding how the real world behaves. They do not model causality, they do not learn from physical or operational feedback, and they do not build internal representations of environments, important limitations that even their most prominent proponents now openly acknowledge. They predict words, not consequences, a distinction that becomes painfully obvious the moment these systems are asked to operate outside purely linguistic domains.

The false choice holding AI strategy back

Much of today's AI strategy is trapped in binary thinking. Either companies rent intelligence from generic models, or they attempt to build everything themselves: proprietary infrastructure, bespoke compute stacks, and custom AI pipelines that mimic hyperscalers.

That framing is both unrealistic and historically illiterate. Most companies did not become competitive by building their own databases. They did not write their own operating systems. They did not construct hyperscale data centers to extract value from analytics. Instead, they adopted shared platforms and built highly customized systems on top of them, systems that reflected their specific processes, constraints, and incentives. AI will follow the same path.

World models are not infrastructure projects

World models, systems that learn how environments behave, incorporate feedback, and enable prediction and planning, have a long intellectual history in AI research. More recently, they have reemerged as a central research direction precisely because LLMs plateau when faced with reality, causality, and time.

They are often described as if they required vertical integration at every layer. That assumption is wrong. Most companies will not build bespoke data centers or proprietary compute stacks to run world models. Expecting them to do so repeats the same mistake seen in earlier AI-first or cloud-native narratives, where infrastructure ambition was confused with strategic necessity.

What will actually happen is more subtle and more powerful: World models will become a new abstraction layer in the enterprise stack, built on top of shared platforms in the same way databases, ERPs, and cloud analytics are today. The infrastructure will be common. The understanding will not.

Why platforms will make world models ubiquitous

Just as cloud platforms democratized access to large-scale computation, emerging AI platforms will make world modeling accessible without requiring companies to reinvent the stack. They will handle simulation engines, training pipelines, integration with sensors and systems, and the heavy computational lifting, exactly the direction already visible in reinforcement learning, robotics, and industrial AI platforms.

This does not commoditize world models. It does the opposite. When the platform layer is shared, differentiation moves upward.
Companies compete not on who owns the hardware, but on how well their models reflect reality: which variables they include, how they encode constraints, how feedback loops are designed, and how quickly predictions are corrected when the world disagrees. Two companies can run on the same platform and still operate with radically different levels of understanding.

From linguistic intelligence to operational intelligence

LLMs flattened AI adoption because they made linguistic intelligence universal. But purely text-trained systems lack deeper contextual grounding, causal reasoning, and temporal understanding, limitations well documented in foundation-model research. World models will unflatten it again by reintroducing context, causality, and time, the very properties missing from purely text-trained systems.

In logistics, for example, the advantage will not come from asking a chatbot about supply chain optimization. It will come from a model that understands how delays propagate, how inventory decisions interact with demand variability, and how small changes ripple through the system over weeks or months.

Where competitive advantage will actually live

The real differentiation will be epistemic, not infrastructural. It will come from how disciplined a company is about data quality, how rigorously it closes feedback loops between prediction and outcome (remember this sentence: "feedback is all you need"), and how well organizational incentives align with learning rather than narrative convenience. World models reward companies that are willing to be corrected by reality, and punish those that are not.

Platforms will matter enormously. But platforms only standardize capability, not knowledge. Shared infrastructure does not produce shared understanding: Two companies can run on the same cloud, use the same AI platform, even deploy the same underlying techniques, and still end up with radically different outcomes, because understanding is not embedded in the infrastructure. It emerges from how a company models its own reality.

Understanding lives higher up the stack, in choices that platforms cannot make for you: which variables matter, which trade-offs are real, which constraints are binding, what counts as success, how feedback is incorporated, and how errors are corrected. A platform can let you build a world model, but it cannot tell you what your world actually is.

Think of it this way: Every company using SAP does not have the same operational insight. Every company running on AWS does not have the same analytical sophistication. The infrastructure is shared; the mental model is not. The same will be true for world models. Platforms make world models possible. Understanding makes them valuable.

The next enterprise AI stack

In the next phase of AI, competitive advantage will not come from building proprietary infrastructure. It will come from building better models of reality on top of platforms that make world modeling ubiquitous. That is a far more demanding challenge than buying computing power. And it is one that no amount of prompt engineering will be able to solve.
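To illustrate the prediction-and-correction loop the essay describes, here is a deliberately tiny Python sketch. It is a hypothetical toy, not any vendor's platform or API: a single parameter stands in for a "world model" of the logistics example, predicting how an upstream delay propagates into delivery time, observing the actual outcome, and adjusting its belief in proportion to the error.

```python
# A deliberately tiny, hypothetical sketch of the predict -> observe -> correct
# loop discussed above. One parameter stands in for a "world model" of how an
# upstream delay (in hours) propagates into the final delivery delay.
import random

random.seed(0)

TRUE_PROPAGATION = 1.8   # the real world's (unknown) delay multiplier
belief = 1.0             # the model's initial, wrong belief
LEARNING_RATE = 0.1      # how strongly each correction moves the belief

def predict(upstream_hours: float) -> float:
    """Predicted downstream delay under the current belief."""
    return belief * upstream_hours

def observe(upstream_hours: float) -> float:
    """Stand-in for reality: the true propagation plus measurement noise."""
    return TRUE_PROPAGATION * upstream_hours + random.gauss(0, 0.5)

for step in range(200):
    upstream = random.uniform(1, 10)        # an upstream delay event
    error = observe(upstream) - predict(upstream)
    # Close the loop: let the observed outcome correct the belief.
    belief += LEARNING_RATE * error / upstream

print(f"learned propagation factor: {belief:.2f} (true value: {TRUE_PROPAGATION})")
```

The point of the toy is the loop, not the model: the belief converges toward reality only because every prediction is checked against an observed outcome, which is the discipline that separates companies running on identical platforms.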


Category: E-Commerce

 

Latest from this category

13.02 Popeyes is losing the chicken sandwich wars. Its comeback plan starts with low-performing locations
13.02 Advertising made the internet accessible. Will it do the same for AI?
13.02 San Jose just made its buses 20% faster
13.02 Anthony Edwards has a plan to get your attention
13.02 The U.S. government has 3,000 AI systems in place. Will they fix anything?
13.02 How to let go of resentment on the job
13.02 How women's skiwear falls short when it comes to actually skiing
13.02 How to meet the surging energy demand without needing as much new electricity
E-Commerce »

All news

13.02 There's no vacancy at Hotel Chocolat, but plenty of hot chocolate at the growing Chicago-based British import
13.02 Popeyes is losing the chicken sandwich wars. Its comeback plan starts with low-performing locations
13.02 Advertising made the internet accessible. Will it do the same for AI?
13.02 There's a new John Wick game on the way
13.02 The U.S. government has 3,000 AI systems in place. Will they fix anything?
13.02 Anthony Edwards has a plan to get your attention
13.02 San Jose just made its buses 20% faster
13.02 Marvel Tōkon: Fighting Souls lands on PS5 and PC August 6 with X-Men in tow
More »