Facebook sold us out. Smart contracts can give control back

When we share online, there is an illusion that data can be deleted from the internet and that we, as consumers, have a right to delete it. Sharing online is not a simple transaction, like a bank transfer, with a direct refund or "delete" function. From the moment a post is shared, that content is distributed to servers all over the world, and deleting it entirely becomes nearly impossible. The social media platforms we use every day have become so ubiquitous because our data fuels their business; it is how Facebook makes money. It is evident, in the wake of the Cambridge Analytica scandal, that Facebook has been willing to give applications sweeping access to harvest user information. Within this blatant disregard for the integrity of user data is an easily understood message: Facebook does not care about the security of user information, and it never has.

New technology, born out of the cryptocurrency revolution, is positioned to give us more control over the data we share. Smart contracts, pioneered on Ethereum but now in use across a number of cryptocurrencies, could give social media users a seat at the table and simple controls over their data. Imagine your social media posts are inside a building that only you have the key to, and you decide how big the windows are that let others view the posts inside. If you learn that someone is using your posts in a way you don't like, you can close the blinds, lock the doors, and remove access to the building. Smart contracts can protect your social media posts in exactly this simple, elegant way.

The internet has no delete button

Whenever you share a Facebook post about Trump or your dog, it is saved multiple times across Facebook's servers. This is a feature of any globally distributed service: if a group of servers in North America is accidentally deleted, the same content is still present on Facebook's Asian servers. It is also faster to send a user in Australia data from a server in New Zealand than from a server in the United States. This is the first barrier to fully deleting anything from Facebook -- shortly after a post is made, it automatically exists in multiple places within the system.

Beyond that, anyone who wants a snapshot of your Facebook post can save that content privately just by saving a copy of the page or taking a screenshot. Archive.org, a non-profit archive that saves snapshots of web pages as a historical registry, demonstrates this at scale. It is a service I have a lot of fun using; just check out AOL.com from 18 years ago! It shows how easily web pages can be saved, especially when likes, views, retweets, shares, and other signals indicate that the content is valuable.

They made the rules, we pay the price

We have allowed Facebook to set the rules about how our data can be used because it is a truly revolutionary product that has changed how we interact online. This has been the status quo because of what we get in return: pictures of babies from friends and family, events in our communities, new connections in a new city, and the countless other social advantages of an online network. There have been times when changes to Facebook's Terms and Conditions prompted calls for boycotts, over News Feed changes or the threat that we would have to pay for the service, but never for the kinds of reasons we are confronting now.

Even before the 2016 election, it was obvious to the public that social media would influence the eventual outcome, but Facebook itself did not acknowledge its role in shaping it. It was not completely clear until after the election how influential Facebook and Twitter had been in shaping voters' ideas, and even then we watched Mark Zuckerberg publicly wriggle away from any responsibility for wrongdoing. It took over a year after the election for Facebook to acknowledge that the platform plays some role in how voters' ideas are formed, shaped, and possibly changed. Even then, with the true extent of the manipulation of its users known internally, Facebook still did not fully acknowledge its mistakes. It took separate investigations by the New York Times and The Guardian to force Facebook to publicly acknowledge that it had allowed user data to be exploited by a company called Cambridge Analytica, more than a year and a half after that data had already accomplished a huge project: helping Trump become President.

Years before the election, data harvesting and analysis were being undertaken by a company that would later become Cambridge Analytica. The company employed Aleksandr Kogan, a Research Associate in Cambridge University's psychology department, who developed an academic study that gathered information not only from the Facebook users who took it but, through the access Kogan was granted, from all of their Facebook friends. The study's results were combined with other publicly available datasets, like voter and census records, to build "psychographic" profiles of over 50 million Facebook users. This dataset was then sold as a political campaign tool, used to target increasingly specific subsets of users with advertising tailored to their profiles.

Changing how we "agree to terms"

The main driver of Facebook's rapid ascension is that it advertised to developers that users around the world were willing to share their information in exchange for experiences online. Facebook made it very easy to sign up for a new service by clicking the "Sign Up with Facebook" button. This gives a new service an easy way to get users in the door, and Facebook's footprint in digital life expands. The process is so ubiquitous now that many people I know keep their Facebook accounts just to sign in to other services. In the end, consumers demand convenience and will continue to assume that they are not sharing "that much" or that the applications are harmless. It is now clear that even applications approved to work with Facebook are not harmless and can actually intend to exploit our data on a massive scale.

Signing up for a new service or creating an account always requires agreeing to some form of Terms of Service. These agreements are now commonplace and have been genuinely beneficial for consumers: we can ask "Did the company break their agreement?" and reach a decision on that question more easily. Of course, consumers have no input on these documents; they are typically far-reaching, and they are rarely ever reviewed. There is a clear need for tools that empower users to better protect themselves and their data from exploitation online. One such tool, already employed in real-world environments, is the smart contract, and it could revolutionize individual security online.

Through the use of smart contracts, users on social media could automatically grant access to new applications online, but with built-in protections that prevent the data from being used outside the parameters of the contract. The idea is similar to choosing an audience, "Friends or Public?", when you share a post on Facebook, just much more secure. The agreement would function much like our current arrangements between user and service, but the consumer finally gets a seat at the table to negotiate terms, and the data shared with the service gains a significantly higher level of security.
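To make the idea concrete, here is a minimal sketch in Python of the permission logic such a contract could encode. This is not real Ethereum code; the `DataAccessContract` class, its method names, and the app names are hypothetical illustrations of the grant/revoke model described above.

```python
# Toy model of a data-sharing smart contract: the user is the only
# party who can grant or revoke an application's access to their data.
# All names here are illustrative, not a real Ethereum API.

class DataAccessContract:
    def __init__(self, owner: str):
        self.owner = owner
        self.permissions: dict[str, set[str]] = {}  # app -> allowed data scopes

    def grant(self, caller: str, app: str, scopes: set[str]) -> None:
        # Only the data owner may change permissions -- the "key to the building".
        if caller != self.owner:
            raise PermissionError("only the owner can grant access")
        self.permissions[app] = set(scopes)

    def revoke(self, caller: str, app: str) -> None:
        # "Closing the blinds": the owner withdraws an app's access entirely.
        if caller != self.owner:
            raise PermissionError("only the owner can revoke access")
        self.permissions.pop(app, None)

    def can_read(self, app: str, scope: str) -> bool:
        # Every read is checked against the contract's current permissions.
        return scope in self.permissions.get(app, set())


contract = DataAccessContract(owner="alice")
contract.grant("alice", "quiz_app", {"posts"})
print(contract.can_read("quiz_app", "posts"))   # access allowed while granted
contract.revoke("alice", "quiz_app")
print(contract.can_read("quiz_app", "posts"))   # access gone after revocation
```

On a real blockchain, the `caller` check would be enforced by cryptographic signatures rather than a string comparison, so no one could impersonate the owner.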

Smart contracts are built on the same revolutionary technology that gave life to Bitcoin, the blockchain, and they can begin to test how consumers might be empowered with better tools to protect their social media data. The blockchain is a ledger of transactions secured by a decentralized network of computers all over the world. This means there is no central authority, like a government or a central bank, that can manipulate, change, or delete transactions. Traditionally, blockchain technology has only been used to transfer and store value on the internet; in other words, internet money. The Ethereum network, launched in 2015, extends the same technology to verifying and storing contracts. Traditionally, an impartial third party (a notary) is needed to verify a contract between people: that third party confirms that everyone involved is a willing participant and then signs the transaction as valid. In the case of smart contracts, software is the third-party validator that signs contracts between people on the Ethereum blockchain. Smart contracts are not the only answer to protecting user data online, but they can begin to prototype consumer protections that make exploitation much harder.
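The property that makes such a ledger hard to manipulate can be illustrated with a toy hash chain in Python. This is a deliberately simplified model, not how Ethereum actually stores blocks: each entry records the hash of the previous one, so altering any past entry breaks every link after it.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash a block's full contents, including the previous block's hash,
    # so any change to an earlier block invalidates all later ones.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def verify(chain: list) -> bool:
    # Recompute each link; a single mismatch means the ledger was tampered with.
    return all(
        chain[i]["prev"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, "alice grants quiz_app access")
append_block(chain, "alice revokes quiz_app access")
print(verify(chain))   # the untampered chain checks out

chain[0]["data"] = "alice grants quiz_app access forever"
print(verify(chain))   # the edit breaks the link to the next block
```

A real blockchain adds proof-of-work or proof-of-stake and thousands of independent copies of the chain, which is why rewriting history is impractical rather than merely detectable.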

Self-protection and self-governance are important parts of our daily lives, but they must be thought of as prototypes for how we can extend protection and governance to everyone. As a person who protects themselves, you lock the doors at night, clean up the dishes, don't overshare at work about the thing on your leg, keep your bank password safe, and call your mom every once in a while! Of course, government-imposed protections allow all of us to be a bit negligent in our self-protection; didn't lock your doors and got robbed? A well-regulated insurance industry has your back. Oh, you did overshare about that thing on your leg with Karen at work? Darn, well, it turns out she referred you to an in-network dermatologist in a fully government-regulated industry.

Facebook and other technology behemoths need more oversight, but hastily developed and untested regulation of technology is regressive. Smart contracts can begin to prototype how consumers might be empowered with additional rights over their data. By accepting that companies like Facebook are not incentivized to self-govern in a way that benefits users, we can begin to test and develop new tools of consumer empowerment.

Cryptocurrency is a work in progress and needs skeptics, optimists, revolutionaries, academics, convenience seekers, early adopters, and technophobes alike to participate. Ask questions in the comments; they help me write more about this in the future!
