Did Microsoft compromise on ethics for AI progress?

Kerem Gülen
Mar 14, 2023

Microsoft has reportedly laid off its entire ethics and society team within its artificial intelligence organization, part of a broader round of layoffs affecting 10,000 employees across the company, Platformer reports.

The move leaves Microsoft, a leading force in bringing AI tools to the mainstream market, without a dedicated team to ensure its AI principles are closely integrated into product design — a serious concern.

Microsoft still maintains an active Office of Responsible AI, which sets the rules and principles governing the company's AI initiatives. Even so, the elimination of the ethics and society team underscores how much a dedicated group matters for embedding ethical considerations in product design.

Despite the recent layoffs, the company contends that its total investment in responsibility work is growing.

“Microsoft is committed to developing AI products and experiences safely and responsibly, and does so by investing in people, processes, and partnerships that prioritize this,” Microsoft said in a statement. “Over the past six years we have increased the number of people across our product teams and within the Office of Responsible AI who, along with all of us at Microsoft, are accountable for ensuring we put our AI principles into practice. […] We appreciate the trailblazing work the Ethics & Society did to help us on our ongoing responsible AI journey.”

According to employees, the ethics and society team played a crucial role in ensuring that Microsoft's responsible AI principles were actually reflected in the design of the products the company shipped.

“People would look at the principles coming out of the office of responsible AI and say, ‘I don’t know how this applies,’” one former employee says. “Our job was to show them and to create rules in areas where there were none.”

The "responsible innovation toolkit," established by the ethics and society team, encompassed a role-playing game known as Judgment Call, which facilitated designers in predicting likely damages that could surface from AI and deliberating on them during product development.

The toolkit was posted publicly, making it accessible to all Microsoft employees. More recently, the team had been focused on identifying the risks of Microsoft's integration of OpenAI's technology across its suite of products — work that highlights the importance of anticipating and addressing the ethical implications of AI before it ships.

What has happened since 2020?

At its peak in 2020, the ethics and society team numbered nearly 30 people, including engineers, designers, and philosophers. In October of that year, however, a reorganization shrank the team dramatically, to just seven members.

After the restructuring, John Montgomery, corporate vice president of AI, met with the team to explain the urgency behind the move: company leaders, he said, had instructed them to move quickly.

“The pressure from [CTO] Kevin [Scott] and [CEO] Satya [Nadella] is very, very high to take these most recent OpenAI models and the ones that come after them and move them into customers’ hands at a very high speed,” Montgomery said.

Because of this pressure to deliver, Montgomery explained, much of the team would be transferred to other departments within the organization. Some team members voiced their disagreement with the decision.

“I’m going to be bold enough to ask you to please reconsider this decision. While I understand there are business issues at play … what this team has always been deeply concerned about is how we impact society and the negative impacts that we’ve had. And they are significant,” one employee stated.

Confronted with the team's pushback, Montgomery stood firm: "Can I reconsider? I don't think I will," he said, citing the continued pressure to deliver. He acknowledged that the full picture might not be visible from the team's vantage point, remarking that "there's a lot of stuff being ground up into the sausage." Still, he insisted that eliminating the team was not part of the reorganization plan.

“It’s not that it’s going away — it’s that it’s evolving,” he said. “It’s evolving toward putting more of the energy within the individual product teams that are building the services and the software, which does mean that the central hub that has been doing some of the work is devolving its abilities and responsibilities.”

After the reorganization, most team members were redeployed to other departments within Microsoft. Those who remained on the ethics and society team said the reduced headcount made it difficult to carry out their ambitious plans.

The move now leaves Microsoft without a dedicated team to ensure its AI principles are closely integrated into product design, a critical gap as the company works to bring AI tools to the mainstream. It underscores the dilemma large tech companies face in balancing innovation against ethical accountability.
