Microsoft reportedly can look into Bing Chat conversations: Okezone techno

JAKARTA – There has been a lot of talk recently about OpenAI’s ability to snoop on the conversational content of ChatGPT users, and now similar accusations have been leveled at Microsoft. The tech giant is reportedly capable of saving the entire content of Bing Chat users’ conversations.

As reported by Mashable on Thursday (8/17/2023), the allegation stems from a new Microsoft policy introduced last July. The policy states that Microsoft will process and store user inputs starting September 30th.

“As part of providing artificial intelligence services, Microsoft will process and store user input to the service, as well as the output of the service, for purposes of monitoring and preventing abusive or harmful use or output of the service,” said Microsoft.

It’s unclear how long conversations will be kept, but according to The Register’s reading of the new “AI Services” clause in the terms of service, Microsoft can save a user’s conversations with Bing if the user is not acting on behalf of an organization.

Microsoft did not immediately respond to a request for comment on the reports, and a company spokesperson declined to elaborate, saying only that the step was taken to provide better products and services to users.

“We regularly update our terms of service to better reflect our products and services. Our latest updates to the Microsoft Services Agreement include adding language to reflect artificial intelligence in our services and their appropriate use by customers,” a Microsoft representative said.


In addition to data retention, the new AI Services clause contains four additional policies. The first prohibits users from using the artificial intelligence services to discover the underlying components of the models, algorithms, and systems.

The second policy is that users are not allowed to extract data from the AI services. The third prohibits users from using the AI Services to directly or indirectly create, train, or improve other AI services.

Finally, users are fully responsible for responding to any third-party claims regarding their use of the Artificial Intelligence Services in compliance with applicable laws, including, but not limited to, copyright infringement or other claims related to content output.

If the allegation is correct, it would be a problem for users who expect their chats to remain confidential. The policy poses little risk, however, as long as users never discuss anything on Bing Chat that they don’t want others to read.
