Time and Attention | The History of Social Media

“Distracted from distraction by distraction.” – T.S. Eliot


The first attempt at what would later be defined as “social media” took place in the fall of 1969. The United States military was conducting tests using the first proto-internet, called ARPANET, which was used to create connections between computer stations. As Charles Herzfeld, ARPA Director (1965–1967) described it:

“The ARPANET came out of our frustration that there were only a limited number of large, powerful research computers in the country, and that many research investigators, who should have access to them, were geographically separated from them.”

This pre-internet experiment spawned the famous communication – or miscommunication – that occurred in October of 1969. An attempt was made to send the word “login” from a computer at UCLA to one at the Stanford Research Institute, but a system crash meant only the letters “LO” were received. This first miscommunication did not stop the developers, who successfully sent the full message a month later. Both the failed transmission and the successful one would have a lasting effect on social media, whether those developers intended it or not.

The Emergence of Social Sites

Though the ARPANET project closed in 1990, internet use skyrocketed. The network’s expansion through the 1990s renewed the desire for better connectivity between users, and though chatrooms and direct messaging satisfied part of that desire, the push for a social media network never lost its momentum.

In 1997, a service called Six Degrees launched. The first of the modern social media sites, it peaked at one million members in its heyday. Users could create profiles and friend one another in order to stay connected with old friends or make new ones. LiveJournal, a platform that let users write short blogs to keep friends and family updated about their lives, started in 1999 as a competitor to Six Degrees. These two networks would reign over the social realm until 2002, when Friendster emerged and, like its two predecessors, gained a substantial following.

Building upon these concepts of social networking, LinkedIn was created in 2003 as a network for professionals, connecting over 1 million users within its first year. MySpace broke the mold in personal sharing, amassing over 100 million accounts by 2006, even as Facebook, Twitter, and other social networks emerged in parallel. As time went on, however, usability and simplicity gained prominence over features, and as the MySpace bubble deflated, Facebook, Twitter, and YouTube exploded. More competitors entered the market as the opportunity for engagement increased.

Today, Facebook has over 2.4 billion users, YouTube 1.5 billion, Instagram 700 million, and LinkedIn over 450 million. As more users engage with these and other social platforms, these numbers will only increase.

The history of modern social media is fairly short, and you may be wondering why we would cover such recent cultural history. The reason is not to report numbers or recount events you may already be familiar with, but rather to explain why the concept and use of social media has become so prevalent in such a short period of time, and what effects it may have on our culture as a whole.


Facebook was founded in 2004 by Mark Zuckerberg. A Harvard student at the time, Zuckerberg’s stated intention for creating what was then called TheFacebook was to build out the student directory at Harvard and connect students across the campus. At its onset, membership was limited to Harvard students only, but in March of 2004 it expanded to Stanford, Columbia, and Yale. Later that year, Sean Parker invested in Zuckerberg’s idea, incorporated the company, and dropped the “the”, coining the name we all now know: Facebook.

In 2005, Facebook expanded to include more universities, eventually reaching into high schools and being offered – and accepted – by many colleges and universities overseas, including in Canada, the United Kingdom, and Ireland. In 2006, Facebook expanded beyond educational organizations entirely, opening up to anyone 13 years of age or older. By 2008, Facebook had over 100 million members.

Surprisingly, monetization was not on the radar for Facebook’s founders in its initial stages. Investments fueled the growth of the company, with investors spurred by the belief that profitability was inevitable with such a large audience willing to make itself captive in one venue. It was only after Facebook hired Sheryl Sandberg in 2008 that advertising became the primary revenue generator for the company. After all, what else would you do with that kind of captive audience?

Eventually, Facebook would file for an IPO, going public in 2012 in an effort to raise more capital for continued acquisitions, investments, and expansion.

Unlike Twitter, LinkedIn, and Instagram, Facebook was developed to provide an opportunity for a “deep dive” into digitally acquired relationships. While some use it to stay in touch, others use it to connect and collaborate, and still others just use it to argue and complain. However users choose to employ it, Facebook offers the most opportunity to engage with others. Its longer statuses allow for more content, which inherently provides more context, and with the ever-expanding options for dressing up a status, these pieces of content have defined the digital experience on Facebook as a whole – a contrast with the second most popular social media channel, Twitter.


The word “twitter” is, ironically, defined as “a short burst of inconsequential information” or “a chirp from birds.” While it’s easy to draw parallels – and make jokes – about how well this definition reflects current Twitter use, the overall concept – and challenge – of communicating an idea in a limited number of characters fundamentally defined the user experience on Twitter and how the channel evolved.

While Jack Dorsey initially conceived of it as a way to link up small groups of people and keep them connected, Twitter was not founded as its own company until 2007. Its popularity surged during the SXSWi conference that year, where the founders put up two massive plasma screens streaming a constant feed of tweets at all hours of the day. The spectacle generated interest, as attendees wanted to join the ongoing digital conversation.

Growth accelerated over the next few years as the service continued to expand, with over 65 million tweets posted each day – 750 tweets per second – by 2010. The ever-present 140-character limit organically created a new way of communicating on the channel. With such a constraint, users had to choose their words carefully so that readers, in turn, could absorb the information with equal consideration and speed. This rapid-fire approach to communication came to define Twitter, and influenced the habits – both good and bad – that we see today.

Though the character limit has expanded options for the user, the fundamental way communication is conducted on Twitter remains the same. Short bursts. Quick assimilation. Whether the information is consequential is up to the user.


Differences between social channels can be summed up by the type of status they allow. Imagine yourself in a conversation with another person. If you have the opportunity to give longer explanations of specific ideas and topics, there is a better chance for a continued conversation or bond to develop. In contrast, quick bursts of communication generate specific types of responses, and the message becomes the only consideration, rather than the relationship itself. What if the conversation included you sharing a piece of information with the person, such as a book, a link, or an image? How would the interaction change then? While exceptions certainly exist, the type or length of the message being shared drives a response of similar length. Quick comments get quick comments in return. Longer comments get longer answers.

For Facebook, status length is up to the user, as are attachments and whatever images or links are included. A longer status means more can be communicated, allowing the possibility of deeper conversation and connection. In contrast, Twitter restricts status length, and though many updates have added more options, the “get to the point” mentality of Twitter – governed by that restriction – allows for more opportunities to connect with more people, thus deemphasizing the depth of those connections. LinkedIn is a social network for professionals, with few limits on statuses but fewer options than Facebook, while Instagram – which is owned by Facebook – provides an image-based status that can accomplish the deeper context while providing a visual that encapsulates the intended experience.

Social media channels are conversation facilitators. It’s important to remember that, especially if you are a business, since businesses tend to look at social – and everything else – as conversion tools. Social gives users the opportunity to connect with others in the specific way that each social channel affords.

Drivers for Social Media Use

Reason #1: Technology

The first and most obvious reason for social media adoption is technology. When Six Degrees and LiveJournal launched, the internet was still in its infancy. Bandwidth was limited, and computers were not nearly as powerful as they are today. More importantly, mobile technology lagged behind, with manufacturers focused on phone size and convenience over functionality. Applications were limited to simple games, such as Blockade (or Snake), which appeared on the Nokia 6110 in December of 1997. It wasn’t until the launch of the iPhone in 2007 and the App Store in 2008 that the modern-day application was realized, with over 100 million apps downloaded in the first sixty days. That same year, Android launched its own version of the App Store, called Android Market. In 2011, Facebook launched its own application, soon followed by Twitter in 2012. As the technology developed, so did users’ ability to connect to these networks and, in the end, to one another.

Reason #2: Psychology

The convenience wrought by modern technology, both desktop and mobile, played a significant role in the surge of social media use. But the second and more important reason for its adoption is much more complex and, as some have said, much more sinister.

Researchers at Harvard tackled the question of social media use in 2012. Using two test groups, they monitored which parts of the brain were engaged. The first group of participants was asked to share about themselves. The second group was asked to share about someone else and to exclude themselves.

In the first test group, the brain’s reward centers were activated and the dopamine pathway engaged. These are the same regions engaged when we receive gifts, have sex, or eat something we really enjoy. Dopamine functions as a neurotransmitter and plays a major role in reward-motivated behavior.

For the second test group – the ones sharing about others – none of these functions were activated. The action was perceived as more of a task, and the participants were bored.

As a result, researchers determined that those sharing about themselves felt intrinsically rewarded for doing so, even when it served no practical purpose. This “intrinsic reward” has since been confirmed by Facebook’s founding president, Sean Parker.

 "The thought process that went into building these applications, Facebook being the first of them ... was all about: 'How do we consume as much of your time and attention as possible?' And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content, and that's going to get you ... more likes and comments. It's a social-validation feedback loop ... exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology. The inventors understood this consciously," he said. "And we did it anyway."

Using Your Information

Today, people are sharing everything about their lives. Every 60 seconds on Facebook, 510,000 comments are posted, 293,000 statuses are updated, and 136,000 photos are uploaded. Over 300 million photo uploads occur per day across the various social channels. And these numbers are always increasing.

Due to the psychological drive to share, and the technological capability to do so, people are sharing everything about themselves, from new jobs and births to the daily struggles they face. Brands have capitalized on these types of shares, employing brand ambassadors to use products or talk about services to their millions of followers. A great example is Dwayne “The Rock” Johnson, who loves to use Instagram to speak directly to his 110 million followers about his movies, his motives, and his Under Armour products (which are kind of awesome).

There are many downsides to this growing psychological need to share about our lives. Once something is posted, it’s there forever. Employers, friends, girlfriends, boyfriends, and family all have access to your posts, and while those who know you may be able to put questionable posts into context, those who don’t may not. Additionally, social media users cannot control how the information they’ve shared is used. Much like a police officer’s ability to search a see-through bag, social media considers nothing you share to be private – after all, you shared it in a public space.

In 2018, Facebook came under fire for allowing a data firm called Cambridge Analytica to harvest data from millions of Facebook profiles for the purpose of promoting then-candidate Donald Trump. The permission to scrape these profiles called into question the rights of the user and how Facebook handles the voluntary submission of personal information, likes, interests, and other data.

Interestingly enough, Facebook did not come under fire for allowing the very same thing during the 2012 Obama reelection campaign, with officials from that campaign even publicly bragging about working with Facebook and the various tools and systems employed to target prospective voters.

 “We ingested the entire U.S. social graph,” Carol Davidsen, director of data integration and media analytics for Obama for America told the Washington Post. “We would ask permission to basically scrape your profile, and also scrape your friends, basically anything that was available to scrape. We scraped it all. Facebook was surprised we were able to suck out the whole social graph, but they didn’t stop us once they realized that was what we were doing. They came to the office in the days following election recruiting, & were very candid that they allowed us to do things they wouldn’t have allowed someone else to do because, after all, they were on our side.”

Should publicly shared information be considered private? If users share personal information on a public forum, does that information still belong to the user, or does it belong to the place where it was shared? These questions, and many more, continue to be asked as the ethics of social media are debated.

The history and use of social media is short, and very complex. One can’t help but find parallels between the first miscommunicated message in 1969 and the millions of miscommunications found in social media today, from fake news to false perceptions (whether intentional or not). Though the concept of interconnectivity is a noble one, questions about its effects – the way by which we are connected, who owns the information we share, and how that information should be used – are still being answered. The history of social media is still being written. Only time will tell whether its use and application find themselves on the good side of history, or the bad.