
LinkedIn co-founder: Facebook's existing business model makes it difficult to protect privacy

via: 博客园     time: 2018/5/14 9:32:04     reads: 569


Eric Ly, co-founder and former chief technology officer of the professional social networking site LinkedIn

Tencent "First Line" correspondent Ji Zhenyu, reporting from Silicon Valley on May 11

Since Facebook's massive user data breach came to light, how to protect users' private data has been widely discussed. Recently, Eric Ly, co-founder and former chief technology officer of the professional social networking site LinkedIn, published an article arguing that under the current business model it is difficult to fundamentally solve the problem of user data leakage.

Ly argues that Facebook's business model depends largely on the accuracy and effectiveness of the advertising it targets at users, and therefore on the quality of the data it collects. Users need only glance at their own settings to see the breadth and depth of the personal information the social network gathers. Facebook stores not only your profile, your history of likes, and your friends list, but also your favorite content, such as movies and music, along with family relationships, contact lists, personal messages, and conversations with loved ones.

He said that although Facebook took a series of measures after the incident, under its existing model it regrettably cannot fundamentally solve the problem of protecting users' private data.

Ly believes that in the absence of government oversight, a technological revolution across the entire social networking ecosystem is imperative, one that lets users keep their personal data secure while still sharing and connecting with one another.

He proposes applying blockchain technology to social networking to solve these problems. Hub, which he co-founded with his team, is a human trust protocol built on blockchain technology. On this system, only authentic, verified accounts can build reputation profiles through real interactions; the data is owned by the users themselves and is exchanged with outside parties only under specific conditions; and a company whose reputation and credibility decline will have its data access rights revoked.

Eric Ly's article reads as follows:

Facebook and Cambridge Analytica, the British political consulting firm that worked for the Trump campaign, have spent the past few weeks mired in scandal and a crisis of confidence. The New York Times and The Observer of London reported that Cambridge Analytica harvested the personal profiles and associated private data of more than 50 million Facebook users in order to develop technology that could predict the behavior of American voters. Facebook's business model depends largely on the accuracy and effectiveness of the advertising it targets at users, and therefore on the quality of the data it collects. Users need only glance at their own settings to see the breadth and depth of the personal information the social network gathers. Facebook stores not only your profile, your history of likes, and your friends list, but also your favorite content, such as movies and music, along with family relationships, contact lists, personal messages, and conversations with loved ones. As the recent Equifax and Yahoo breaches have shown, centralized data sets tend to produce large-scale leaks. A quiz app built by a university researcher in 2014 was able to funnel the data of tens of millions of Facebook users to a political consulting firm. Clearly, it is time for social networks to change.

Unfortunately, solving this problem within Facebook's current business model is difficult, if not impossible. In an interview with CNN, Zuckerberg said: "This was a major breach of trust, and I am very sorry that this happened." Facebook followed up by taking out apology advertisements in The New York Times, The Washington Post, and The Wall Street Journal, and promised to review the data policies of thousands of applications.

However, the sheer scale of Facebook's operation makes monitoring the reputation and credibility of accounts and applications onerous and unsatisfactory. Even if Facebook increases its security staff from 15,000 to 28,000 people (already an enormous number), with roughly two billion users each employee would still be responsible for the security of approximately 71,000 users, a number that is simply too large.

Moreover, the privacy problems of the Facebook platform are made worse by the lack of adequate user knowledge and consent. When users agreed to take a quiz about their favorite 1990s comedy characters, many simply did not know that their personal data, and their friends' data, would be passed on to Cambridge Analytica (unless they had read the Facebook API documentation from beginning to end). Furthermore, even when security staff flag an application as suspicious, the quiz can be taken down and replaced by hundreds of replicas that continue collecting user information. So unless users spend a great deal of extra effort analyzing these companies' data collection policies, they simply cannot prevent exploitative attacks on their personal data and privacy.

Even in this context, the U.S. government has done little to help users understand when their data is being collected, largely because the United States remains one of the major countries in the world that lack such legislation. Congress rolled back FCC regulations, adopted in 2016, that would have prevented internet service providers (ISPs) from selling user data without user consent. Legislators have also refused to take any action against Silicon Valley, arguing that such regulations would be intrusive and would stifle innovation.

Therefore, in the absence of government oversight, a technological revolution is imperative, one that allows users to protect their personal data while still sharing and connecting with one another.

This is where blockchain comes in: a decentralized, distributed ledger and database that everyone can access and on which each person controls who holds their information. Our team is building Hub, a Human Trust Protocol, on this foundation:

1) Authentic, verified accounts build reputation profiles through real interactions;

2) data is owned by the users themselves and is exchanged with outside parties only under specific conditions; and

3) organizations whose reputation and credibility decline have their data access rights revoked.

The Hub Human Trust Protocol and the Hub digital currency were created to address the problem of trust online, allowing users to assess trust at a distance and across applications. The value of the Hub Human Trust Protocol lies not in extracting and exploiting user data, as Facebook does, but in letting users build trust and reputation in a decentralized way.

Trust is built into the Hub network from the start: Hub fosters trust by incentivizing positive, trustworthy interactions between users. Reputation data becomes available only when its owner approves, letting users share relevant data on their own terms. The Hub Human Trust Protocol will also allow users to view the reputation data of the people or organizations they transact with online, while prohibiting access to their own information without prior permission. As a result, the disclosure of any private information rests entirely in the hands of the users who own that data.
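The article does not publish Hub's actual data structures or APIs, so the following is only a minimal illustrative sketch, in Python, of the access model described above: reputation built through real interactions, reputation data readable only with the owner's explicit grant, and access revoked when a party's reputation declines. Every name in it (ReputationLedger, grant_access, REVOCATION_THRESHOLD, the reputation deltas) is hypothetical.

```python
# Minimal illustrative sketch only; Hub's real protocol is not specified in the
# article, and all names and numbers here are assumptions made for illustration.

from dataclasses import dataclass, field

REVOCATION_THRESHOLD = 0.5  # assumed cutoff below which access rights are withdrawn


@dataclass
class Account:
    account_id: str
    verified: bool = False   # only authentic, verified accounts participate
    reputation: float = 1.0  # built up (or eroded) through real interactions


@dataclass
class ReputationLedger:
    # owner_id -> set of requester_ids the owner has explicitly approved
    grants: dict = field(default_factory=dict)

    def record_interaction(self, a: Account, b: Account, positive: bool) -> None:
        """Positive, trusted interactions raise reputation; negative ones lower it."""
        delta = 0.1 if positive else -0.2
        for acct in (a, b):
            acct.reputation = max(0.0, acct.reputation + delta)
            self._revoke_if_untrusted(acct)

    def grant_access(self, owner: Account, requester: Account) -> None:
        """Reputation data is exchanged only with the owner's explicit approval."""
        if not (owner.verified and requester.verified):
            raise PermissionError("only verified accounts can exchange reputation data")
        self.grants.setdefault(owner.account_id, set()).add(requester.account_id)

    def read_reputation(self, owner: Account, requester: Account) -> float:
        """Disclosure stays in the owner's hands: no prior permission, no access."""
        if requester.account_id not in self.grants.get(owner.account_id, set()):
            raise PermissionError("no prior permission from the data owner")
        return owner.reputation

    def _revoke_if_untrusted(self, requester: Account) -> None:
        """A decline in reputation and credibility costs a party its access rights."""
        if requester.reputation < REVOCATION_THRESHOLD:
            for approved in self.grants.values():
                approved.discard(requester.account_id)
```

In this toy model, a party that repeatedly behaves badly falls below the threshold and automatically loses every grant it held; the point is simply that access follows reputation and the owner's consent rather than a platform operator's policies.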

Therefore, the Hub Human Trust Protocol will foster and maintain trust and reputation among parties who do not yet trust one another, add weight to interactions by letting participants stake trust in them, and economically motivate users and organizations to behave fairly and trustworthily on the Internet. Equally important, Hub will work to address Sybil attacks and will continue to refine its economic structure as the Human Trust Protocol grows.

Protecting personal data and privacy holds tremendous potential value for users. However, centralized companies such as Facebook still cannot solve the problem of community security, and users' data and privacy will continue to be misused in the current ecosystem. In a world where bad actors on the network have an ever greater negative impact, decentralized organizations need to step forward and create a new order that solves these problems and pain points. The Hub Human Trust Protocol strives to be that pioneering solution.
