2025-08-26 23:27:00| Fast Company

Diversity, equity, and inclusion (DEI) has faced a lot of backlash recently. Once celebrated as a win-win solution that tackled systemic injustice and boosted business performance, DEI has become politicized and scrutinized within an inch of its life. As it was happening, those of us working to advance DEI didn't adjust as the ground shifted beneath our feet. DEI was recast as an anti-meritocratic overreach that prioritized identity over skills or qualifications. Whether or not it was true (it wasn't) didn't matter. The new narrative gained traction, and dismantling DEI became a political talking point. Then it became policy. Ever since executive orders promised to revoke federal funding from organizations committed to DEI, there's been a scramble to pivot, roll back, or rebrand DEI programs. My colleagues have been forced to figure out how they can still create space where diversity can flourish while avoiding the ire of political actors with dubious motives.

The replacement: Pluralism?

In a recent New Yorker piece titled "What Comes After D.E.I.?" writer Emma Green floated a new term as a possible successor: pluralism. Unlike DEI, it has no political baggage, and its academic name and origin give it an air of neutrality. But I believe that neutrality may be a massive problem. I knew pluralism well from my days in seminary. Pluralism is, in the words of Harvard University's The Pluralism Project, "an ethic for living together in a diverse society: not mere tolerance or relativism, but the real encounter of commitments." It promotes the enthusiastic embrace and discussion of all viewpoints. When viewpoints clash, pluralism says you should seek to understand opposing perspectives and the ways they are valid for those who hold them. Maybe that's why I bristled when someone first suggested pluralism as a suitable alternative to DEI in business. In academia, there's an understanding that students are there to learn how to think, reason, and engage differences.
It's a space tailor-made for the kind of thoughtful conversations pluralism promises. For pluralism to function there must be grace, and a classroom can offer that in spades. But business is different. Our hiring teams aren't admissions departments; they're hiring people for practical skills we expect them to already have. When someone falters in the workplace, a business's prerogative isn't to extend grace. It's to fix the problem, move on, and minimize risk. Pluralism may offer compelling language for navigating systemic injustices, but real social change demands structural and cultural commitments that pluralism isn't designed to make. Pluralism assumes a lot that isn't often true in business: that we enter dialogues on equal footing, that all perspectives are welcome, that harm can be explored intellectually, and that participants have significant self-awareness and emotional intelligence. But in a workplace, harm is real: it is lived and carries consequences. I'm curious if the sudden interest in pluralism isn't because it's a logical next step from DEI, but because it seems easier. While it isn't equitable, you can see how well-intentioned execs could sell it as such: "agree to disagree" as corporate policy. But that's the problem. Pluralism isn't designed to address inequity or redress harm. It's a posture for conversation without accountability. It feels like we're circling back to well-intentioned but misguided thinking, like the "I don't see color" statements that sought to solve systemic injustices by ignoring that they exist. A Band-Aid on an open wound.

Create space for real conversations

In 2020, many people were shocked to learn there was an infection under that Band-Aid. For some business leaders, the protests following George Floyd's murder were the first time they felt just how much their colleagues of color were suffering under systemic inequity. For others, it was the first time they asked: What's my responsibility? How can I be an ally?
But people were looking for something they could do in that moment, and DEI was the obvious answer, except that creating a true culture of diversity, equity, and inclusion isn't something that can happen overnight. It can't just be handed off to HR. When people raised issues, some legitimate, others less so, we shrugged them off. Without realizing it, we created new divisions that metastasized into today's backlash. The problem was never DEI; it was how we as leaders chose to implement it. Simply swapping DEI for "pluralism" is punting the real issue. Replacing a word in corporate handbooks won't build an equity culture, and it won't keep workplaces from falling apart when real conflict hits. As leaders, we're ultimately on the hook: not for adopting new vocabulary, but for showing up every day with clarity, honesty, vulnerability, and intention. This is the only way to create workspaces where hard conversations aren't just allowed; they're expected. If pluralism has anything to offer the business world, it's as a complement to DEI, not a replacement.

Natasha Nuytten is CEO of CLARA.


Category: E-Commerce

 


2025-08-26 23:00:00| Fast Company

When Austin, Josh, and I started Civic Roundtable in 2022, we never thought we'd be an AI company. We had worked in and around government for about a decade, seeing that public servants doing critical work were underserved by technology. We saw opportunities to make it better. In the public sector, the stakes are high: If an agency's technology fails, real people don't get the health services they need, disaster recovery efforts get delayed, and communities lose access to services they rely on. But we also know the right technology can empower public servants to have a bigger impact. Fast forward three years, and AI is everywhere. Chatbots abound and AI widgets appear in new applications daily. But rather than integrating AI for the sake of AI, we'd like to suggest a different path: Start listening. We've spent the last year on the road asking public servants, "If technology could free up an hour of your time every day, how would your work change?" We've toured state agencies in California and Texas and met with county officials from Oregon to New Jersey. We've run brainstorming sessions with public health officials, spent hours learning alongside election administrators, and strategized with emergency management professionals. The result? Some clear ideas about how AI might actually help the public servants behind the healthy functioning of our communities. So for anyone eager to put AI tools to work in support of the public sector, please steal these three lessons, free of charge.

1. AI must address an actual need

Starting with AI capabilities puts the cart before the horse. Public servants know their own needs. Some processes are not the product of inefficiency; they arise from intentional, legally mandated requirements. Similarly, certain government functions require human judgment for ethical or democratic reasons. This is a good thing.
AI built for public servants should reflect the reality that specific agency workflows differ based on their department's function. For example, officials administering elections have different duties than those implementing programs to provide relief from extreme heat, and these require different technology functionality. Still, useful AI need not limit itself to specific departmental use cases. In our conversations with public servants, we heard again and again that they see clear value in finding information faster. Another area where AI can empower public servants is repetitive, format-driven tasks, like budget analyses, stakeholder mapping, and memo drafting. A granular understanding of, and empathy with, the public servants who best understand this work means the difference between another AI tool and technology that meets public servants where they are, to help them do more.

2. AI must be reliably accurate

One federal official confessed to us, "I've heard that AI is very good at lying. We can't have that." He's right. AI deployed in government agencies needs to be reliably accurate. Even outside of government, people worry about hallucinations, like when Google's AI confidently told a user to use glue to keep the cheese on pizza. When the stakes are higher (public health, emergency management, homelessness response), irrelevant or inaccurate information is a deal breaker. One technique to minimize errors, particularly well-suited to government tools, is intentionally limiting the content underpinning AI responses. This means restricting AI tools to exclusively reference resources that are vetted and approved by the government officials themselves. A second approach to enhance trustworthiness is ensuring that responses come with cited sources.
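These two techniques, answering only from vetted content and attaching cited sources, can be sketched in a few lines. The sketch below is a toy illustration, not Civic Roundtable's implementation: the documents, filenames, and keyword-overlap scoring are hypothetical stand-ins, and a real deployment would pair a retrieval step like this with a language model.

```python
# Toy sketch (hypothetical data) of two trust techniques for government AI:
# (1) answer only from a corpus vetted by officials, (2) cite the source,
# and refuse when no vetted document covers the question.

VETTED_DOCS = [  # stand-ins for official-approved resources
    {"source": "heat-relief-policy.pdf",
     "text": "Cooling centers open when the heat index exceeds 105F "
             "for two consecutive days."},
    {"source": "election-manual-2024.pdf",
     "text": "Provisional ballots must be cured within seven days "
             "of election day."},
]

def answer_with_citation(question: str) -> str:
    """Return the best-matching vetted passage plus its source, or refuse."""
    q_terms = set(question.lower().split())
    best, best_score = None, 0
    for doc in VETTED_DOCS:
        # Crude relevance score: shared words between question and passage.
        overlap = len(q_terms & set(doc["text"].lower().split()))
        if overlap > best_score:
            best, best_score = doc, overlap
    if best is None:
        # Out-of-scope questions get a refusal, not a guess.
        return "No vetted source covers this question."
    return f'{best["text"]} [source: {best["source"]}]'

print(answer_with_citation("When do cooling centers open during extreme heat?"))
```

Even at this toy scale, the refusal branch carries the key design choice: the tool stays silent on anything outside the vetted corpus rather than improvising an answer.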
When it's clear where information is coming from, and easy for public servants to validate those sources, government officials can stand on the firm ground of actual policies, documentation, and their own data without worrying about trusting what AI says. It's okay if a purpose-built government AI tool can't tell you what Taylor Swift's most-streamed single is, but can provide state agencies exceptionally precise answers about the resources, points of contact, and proper processes they need to execute their mission.

3. AI must be easy to deploy

Deploying AI within a government agency can be a complex effort requiring significant work and IT expertise. AI tools that can specifically query an organization's own data (known as RAG, for retrieval-augmented generation) are a powerful way to increase the accuracy and relevance of AI outputs. But implementing RAG LLMs demands substantial technical competency and careful coordination with information technology teams. Most government agencies, especially those at the state, county, and local level, don't have teams of developers waiting to integrate a custom API or put in work to configure a new platform. They need solutions that work from day one with minimal implementation overhead. Government workers are sophisticated users with complex needs, but they don't have the luxury of a complex implementation. Technology that serves them well is ready to work immediately, with sensible defaults and clear documentation.

Let public servants point the way

These practical requirements underscore something deeper we learned on the road: You can't build for government if you don't listen to government workers and understand that they're working toward a mission, trying to make a difference, and striving to have an impact. Sometimes this looks like visible acts of heroism, such as responding to natural disasters.
Sometimes this work is largely invisible, like ensuring adequate distribution and funding for medical care and services in areas most in need. This should inspire technologists. It's profoundly rewarding to build tools that help people navigate complex systems to get the help they need, or that make it easier for dedicated public servants to do their jobs well. Every efficiency gain translates to faster disaster response, better benefit processing, or improved community services. The public sector deserves technology that's built for the mission, not retrofitted from consumer applications. When we take the time to understand that mission, and the real requirements that come with it, we can build tools that don't just work, but actually make government work better for everyone.

Madeleine Smith is cofounder and CEO of Civic Roundtable.



 

2025-08-26 23:00:00| Fast Company

In today's professional landscape, we face a paradox. Even as businesses increasingly seek candidates who have mastered adaptive skills like navigating ambiguity, communicating effectively, and demonstrating resilience through setbacks, our classrooms remain largely fixated on teaching content knowledge that AI can now provide instantly. This skills-and-readiness disconnect isn't new. Consider that most corporate leadership development programs have emphasized many of these same adaptive skills (aka soft skills) for decades. The difference is we can now see the disconnect more clearly, and the consequences of inaction are dramatic. AI has become an X-ray for our education system, revealing critical fractures that have long been masked by traditional assessment methods. When information is universally accessible, success increasingly depends on developing adaptive skills that our current educational approach has struggled to prioritize because they're notoriously difficult to teach and measure at scale. With AI's proliferation, this disconnect will become a chasm if we don't address it now.

3 pillars for instructor-enterprise collaboration

The why behind addressing this disconnect is straightforward: Classrooms are the source of workforce readiness. The how is more complex, but it represents an unprecedented opportunity for instructors and enterprises to develop solutions together and learn from each other. I see three key opportunities for collaboration.

1. Use AI-powered technology to measure vital adaptive skills.

Enterprises have long recognized the value of adaptive skills, but they've struggled to reliably evaluate them during hiring processes and performance evaluations. Similarly, educators understand the importance of these skills but lack scalable methods to teach and assess them. This shared challenge presents an opportunity for collaboration.
AI technologies like sentiment analysis, natural language processing, and behavioral pattern recognition can revolutionize how we measure previously intangible qualities. Imagine environments where learners receive real-time feedback on communication effectiveness, collaboration patterns, or problem-solving approaches, not just whether they arrived at the right answer. By collaborating to develop these assessment technologies, enterprises can help instructors understand which specific behaviors and capabilities correlate with workplace success, while instructors can provide insights into how these skills develop over time. The result is graduation requirements that reflect workforce needs and hiring practices that meaningfully evaluate candidate readiness.

2. Create immersive learning laboratories to develop AI fluency.

Both traditional and workplace learning environments are facing what Omid Fotuhi, director of learning innovation at WGU Labs, calls the "AI Trolley Problem." It borrows from the classic ethical thought experiment, which asks whether one should pull a lever to redirect a runaway trolley, sacrificing one life to save five. Fotuhi uses the metaphor to describe a form of institutional paralysis: a deep discomfort with taking action that might cause harm, even when inaction guarantees it. "When it comes to AI, there's a tendency to fixate on what might go wrong if we act," Fotuhi explained to me. "But we rarely consider what might go wrong if we don't." This aversion to action, though rooted in caution, can quietly perpetuate harm. Failing to implement AI tools could also mean missed opportunities to provide more personalized learning, reduce burnout among educators, or close equity gaps at scale. In many cases, the cost of doing nothing is not neutral; it's compounding. "We need to shift the frame," Fotuhi added. "Yes, using AI carries risk. But so does sitting still."
If we only focus on the potential harm of pulling the lever, we ignore the damage being done by letting the trolley barrel forward. One way of overcoming the risks could lie in creating collaborative learning environments where learners and professionals can safely experiment with AI tools. Enterprises can provide real-world business challenges and access to industry-specific AI applications, while instructors contribute pedagogical expertise and learning environments where failure carries no permanent consequences. These immersive learning laboratories would serve dual purposes: helping learners develop the practical AI fluency they'll need in future careers, while giving enterprises insights into how next-generation workers approach and leverage these tools. Such environments would help foster the metacognitive abilities to determine when and how to best leverage AI. Those are skills that no machine can replicate.

3. Establish continuous feedback loops between learning environments and workplaces.

The pace of technological change demands a more dynamic relationship between classrooms and enterprises than our current system allows. Annual curriculum reviews and occasional industry advisory boards are insufficient when workforce needs evolve monthly rather than yearly. We need continuous, bidirectional feedback mechanisms where learning innovations inform workplace practices and workplace needs shape learning priorities. This means embedding instructors within businesses and bringing industry professionals into classrooms as integral contributors to the learning ecosystem. AI can facilitate this exchange by aggregating and analyzing real-time data about skill demands across industries, helping instructors understand emerging trends before they become mainstream requirements.
Simultaneously, instructors can share insights about learning approaches that effectively develop adaptability and resilience, qualities that enterprises increasingly recognize as essential for organizational agility.

A more collaborative future

At Udemy, we believe instructors must prepare learners not to compete with AI, but to leverage it effectively. This requires a fundamental shift from mere knowledge acquisition to developing the metacognitive abilities and street smarts that machines cannot replicate. This vision for the future extends to instructors, too. Far from replacing them, AI is poised to elevate their role. By automating administrative tasks, enabling personalized support at scale, and creating new ways to teach and measure adaptive skills, it offers opportunities to create learning environments where instructors have the time and tools to set learners up for success in an AI-powered world. For enterprises, this partnership with instructors means investing in the future workforce by contributing expertise, challenges, and resources to classroom innovation, rather than lamenting skills gaps after they emerge. The disconnect between instructors and enterprise is not new, but AI has simultaneously amplified its consequences and offered powerful solutions. By working together to reimagine education with AI as an enabling force rather than a threat, we can build learning environments that prepare learners for a lifetime of adaptation and growth.

Hugo Sarrazin is CEO of Udemy.



 
