So, Microsoft is silently installing Copilot onto Windows Server 2022 systems, and this is a disaster.
How can you push a tool that siphons data to a third party onto a security-critical system?
What privileges does it have upon install? Who thought this was a good idea? And most importantly, who needs this?
#infosec #security #openai #microsoft #windowsserver #copilot
Exchange allows users to access data and Microsoft services and it comes with good documentation and a whole slew of controls for org admins.
Active Directory provides authentication services, mostly for your internal users (so they can access org services, including Exchange). It's also very common to allow guests and to federate under certain circumstances, so your AD talks to their AD and external guests can authenticate and use resources that have been shared with them.
It is also well-documented with tight control in the hands of administrators.
Copilot is a black box. Their terms of service are vague. Microsoft's responsible AI website consists of marketing speak with no details, and the standards guide on the site is mostly questions that amount to "TBD". Administrative ability to control data sharing is non-existent, not yet developed, or minimal.
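For what little control does exist: community reporting points to a "Turn off Windows Copilot" Group Policy (User Configuration > Administrative Templates > Windows Components > Windows Copilot) and a corresponding registry value. Treat the path and value name below as an assumption to verify against current Microsoft documentation for your build. This is a sketch that hides the Copilot UI; nothing here is documented to govern what data, if any, is collected.

```
Windows Registry Editor Version 5.00

; ASSUMPTION: path and value name as commonly reported for the
; "Turn off Windows Copilot" policy; confirm against Microsoft's
; current Group Policy reference before relying on it.
; Disables the Copilot UI for the current user; it does not claim
; to control data sharing.
[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\WindowsCopilot]
"TurnOffWindowsCopilot"=dword:00000001
```

In a domain, the equivalent Group Policy setting is the more maintainable route than hand-edited registry keys, since it can be applied and audited centrally.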
We don’t know the scope of data gathered, the retention and handling policies, or where that data/any models built from that data are going to wind up.
My read is that they’re waiting to be sued or legislated before they impose any limits on themselves.