Don't unknowingly feed personal privacy to AI

[Abstract] IBM said in a statement that it takes user privacy very seriously, has carefully followed privacy principles, and that users can choose to opt out of the dataset.

According to the BBC, IBM was recently accused of obtaining about 1 million photos from the picture-sharing site Flickr, without users' consent, to train its face recognition algorithm.

When personal privacy becomes "data"

It is reported that IBM researchers described in detail, in a public paper, the steps for using these photos for face analysis, including measuring the distances between facial features. The researchers wrote that, using 47 marker points on the head and face, many reliable measurements can be taken from photos of human faces.

For technology companies, the value of these photos is self-evident. A huge image dataset helps train a face recognition algorithm more accurately, so that a user can be quickly recognized across different photos or scenes.

However, the people in those photos probably never imagined that their portrait data would be collected by a technology company and "fed" to artificial intelligence neural networks, unwittingly becoming "food" for the iterative upgrading of face recognition technology.

IBM said in a statement that it takes user privacy very seriously, has carefully followed privacy principles, and that users can choose to opt out of the dataset.

In other words, the chance to opt out should come before the portrait data is collected, when users can learn that their data will be put to other uses. As things stand, many users' privacy has already been violated without their knowledge.

An open secret in the industry

If your photo is among the 1 million mentioned above, you will surely find it unsettling. But for some people, what IBM did is not surprising; it is simply an open secret in the industry.

MIT Technology Review published an article saying that AI researchers have long been collecting large amounts of data from every corner of the Internet to "feed" hungry machine learning algorithms, because training these algorithms must be supported by big data. The article noted that photos on the social platform Instagram are often used as a source of image data by technology companies, and that the labels attached to photos often correspond to their content, which makes it easier for researchers to annotate the data.

Beyond photos on social platforms being acquired without users' knowledge, as artificial intelligence technology is ever more widely applied, fears of improper use of personal privacy arise in other scenarios as well.

For example, many businesses have recently brought face recognition technology to the checkout counter to promote pay-by-face. When consumers pay with their faces, their facial portraits are also captured by the face recognition system. The key question is: will these faces, like the photos on Flickr, become data for training artificial intelligence neural networks, or even be handed to other businesses for other purposes without consumers' consent? These questions are worth asking.

The personal privacy at risk is not limited to facial portraits. A wide variety of applications now let users input by voice, supported by speech recognition technology. But a voiceprint is important personal biometric information. How will this voice data be stored and used once businesses obtain it? After all, once voiceprint data is leaked, criminals could use current speech synthesis technology to produce voices that are hard to tell from the real thing, for fraud and other malicious ends.

Applications of artificial intelligence technologies such as iris recognition, fingerprint payment, and character recognition may likewise involve personal privacy issues.

Users' informed consent is the prerequisite

The era of artificial intelligence is making everyone in it more and more transparent. Even as citizens enjoy its convenience, their personal privacy is easily pushed into a gray zone.

The solution is actually simple: to obtain users' personal privacy information, users' right to know should be guaranteed, and users should have the right to choose not to "feed" AI with their personal privacy.

To achieve this, relying solely on the industry's self-discipline is far from enough. Enacting laws to regulate the application of artificial intelligence technology is the fundamental answer.

Just as the IBM incident was causing an uproar, two U.S. senators proposed a new bill, the Commercial Facial Recognition Privacy Act, to place certain restrictions on technology companies' use of face recognition technology.

The bill proposes to prohibit the use of commercial facial recognition technology to collect and share data for identifying or tracking users without their consent. It would also require companies to clearly inform users when collecting face recognition data, and to share it with third parties only with users' informed consent.

Encouragingly, good news also emerged in related fields during this year's Two Sessions: the Standing Committee of the National People's Congress has included the Personal Information Protection Law and other legislative items closely related to artificial intelligence in its current five-year legislative plan.

It is to be hoped that AI technology will grow ever more capable while also respecting personal privacy.

Extended reading

European and American scholars express concern about the General Data Protection Regulation

Over the past year, the conflict between technological development and privacy protection seems to have intensified. Recently, a number of well-known scholars, including Jean Tirole, winner of the 2014 Nobel Prize in Economics and a professor at the Toulouse School of Economics in France, appeared in Hangzhou to share their views on "data privacy and technological development".

In May 2018, the GDPR (General Data Protection Regulation), issued by the European Union and billed as the strictest data protection law in history, came into force, and giants such as Facebook received sky-high fines from Europe one after another. At the same time, privacy protection policy has remained controversial: some have called for even stricter privacy regulations, while many experts believe that Europe's strict data protection rules are not suitable for adoption worldwide, and that, facing the world of the future, privacy is a minor problem.

During the discussion, scholars from Europe and the United States generally expressed concern about this EU data protection regulation. Jean Tirole argued that the regulation is too complex: disallowing data collection outright would be like "throwing the baby out with the bathwater". Ms. Dempsey, a professor at Berkeley Law School in the United States, said bluntly that current research on privacy needs to be supported and refined with fuller knowledge from economics and sociology.

According to the latest report released by The Economist's think tank, perceptions of privacy carry many regional biases. Respondents in China recognize the "importance of data privacy to corporate governance" more than those in Europe and the United States: 98% of Chinese respondents believe data privacy is an essential part of good corporate governance, diametrically opposed to the common notion that Chinese enterprises pay little attention to privacy. (Jiang Yun; intern Hong Hengfei)
