An app called Google Arts & Culture has caught the public's eye. Though the app has been available for more than a year, you probably didn't expect that it would top the free-app charts on both iOS and Android, as it has since last week.
Specifically, users can upload a selfie through the app, which then automatically finds artworks that resemble them. However, according to a Google spokesperson, the feature has so far launched only in parts of the US, and there are no plans to roll it out elsewhere.
[Image: a Weibo post from Gao Xiaosong]
The playful selfie feature has attracted many users, but it has also exposed problems.
Benjamin Goggin, an editor at Digg, noted that many people of color found the artworks matched to their selfies to be rather limited. Some found that their matches reflected entrenched stereotypes: people of color were often depicted as slaves or servants, and women as characters from erotic novels.
A Google spokesman said,
Meanwhile, the racial problems raised by the selfie feature have been covered by outlets such as Bustle, BuzzFeed, and MarketWatch. Not surprisingly, it has also raised privacy concerns. When a user uploads a selfie, the app displays a message in which Google says it will not use the selfie data for any other purpose, will use it only to find matching artworks, and will not store the photos.
However, the roots of the problem may lie beyond Google Arts & Culture, in the museum world itself.
In 2015, the Mellon Foundation, assisted by the Association of Art Museum Directors and the American Alliance of Museums, launched the first survey of diversity in American museums. The results showed that 84% of museum leadership positions were held by whites, with minorities concentrated in more junior roles. The Mellon Foundation said this indicates that if museums want their future leaders to be sufficiently diverse, they need to actively cultivate young talent now. In addition, art schools enroll more women than men, yet contemporary art exhibitions are still dominated by male artists.
If this is a problem inherited from history, algorithms today are not alleviating it but making it worse.
An algorithm cannot shield users from racial bias; instead, it absorbs and learns prejudice, then amplifies and spreads it, all while creating the illusion that technology is free of human bias. Facial recognition algorithms have already shown how harmful this can be: in one notorious case, two Black users found that Google Photos had labeled their photos as "gorillas."
An algorithm's accuracy depends on its reference data sets, and those data sets reflect the biases of their collectors, whether intentional or not. Joy Buolamwini, a graduate researcher at MIT, studies this problem; she founded the Algorithmic Justice League to prevent prejudice from being encoded into software, which can have enormous consequences for racial identification and civil rights. In her TED talk last year, Buolamwini, who is Black, described how facial analysis software detected her face only after she put on a white mask.
The biases baked into face recognition algorithms echo those of color film. Starting in the 1950s, Kodak sent photo labs cards featuring a female model to help them calibrate skin tones. The models were all called "Shirley," and for the next few decades every Shirley was white. As a result, Black faces were often poorly exposed or rendered inaccurately.
Now, through Google Arts & Culture,