
INTRODUCTION
For most of American history, major changes in daily life came from new machines like the railroad, electricity, cars, and airplanes. In the past 30 years, one invention has been just as powerful, even though you cannot see it like a highway or power line. The internet has reshaped the economy, entertainment, and politics. It has changed how people work, how families communicate, and how students learn.
In many ways, the internet has expanded opportunities. It makes information easier to access. It allows people to learn new skills, find jobs, build communities, and take part in public debates. Creating and sharing ideas is faster than ever before.
But the internet has also created serious problems. Misinformation can spread quickly, and private information can be exposed. The way the internet works means that outrage is often rewarded more than accuracy. Some communities without internet access have been left behind. So, some historians see the internet as the greatest expansion of knowledge in history, while others worry it may be weakening trust in democracy.
Historians in the future will likely debate the same question we are asking now: Has the internet made America a better place?
THE ORIGINS OF THE INTERNET
The internet did not begin as a shopping site or social media platform. It grew out of Cold War fears about communication and national security. During the early Cold War, American leaders worried about what would happen if a nuclear attack destroyed major command centers and killed key leaders. Could the country still send messages and coordinate a response?
To solve this problem, researchers developed systems that could keep working even if parts were damaged. In the late 1960s, government and military funding led to ARPANET, an early computer network connecting universities and research centers. In 1969, a computer at UCLA sent a message to the Stanford Research Institute. The operators tried to type “LOGIN,” but the system crashed after the first two letters, so the first message was simply “LO.” That moment is often seen as the first step toward the modern internet, and “LO” is remembered as the first internet message ever sent.
ARPANET used an important idea called packet switching. Instead of sending information in one long stream, it broke data into small pieces called packets. Each packet could travel a different path to its destination. If one path failed, the information could still get through another way. This design made the system flexible and strong, like a spider web instead of a single road.
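The packet idea can be shown with a short Python sketch. This is a classroom toy, not real networking code: it simply breaks a made-up message into numbered pieces that can arrive in any order and still be put back together.

```python
# Toy illustration of packet switching (a classroom sketch, not real
# networking code). A message is broken into small numbered packets that
# may travel different paths and arrive out of order, yet can still be
# reassembled at the destination.
import random

def split_into_packets(message, size=4):
    """Break a message into (position, text) packets of `size` characters."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the original message, whatever order the packets arrive in."""
    return "".join(text for _, text in sorted(packets))

packets = split_into_packets("LO, this is UCLA calling Stanford.")
random.shuffle(packets)  # simulate packets arriving out of order
print(reassemble(packets))  # the original message, restored
```

Because each packet carries its position, the destination computer can rebuild the message no matter which paths the pieces took.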
Scientists also wanted faster ways to share research. Before networks, they had to wait for journals and books to be published. By linking computers through phone lines, they could share data more quickly. To make different networks communicate, they created shared rules called protocols. The most important was TCP/IP. When many networks adopted these rules, they formed one larger “network of networks.” In 1983, TCP/IP became the standard for ARPANET, and many historians mark that moment as the true beginning of the modern internet.
So, we can see that the internet developed slowly over time. Its creators were not thinking that they were creating a new world of smartphones, apps, or streaming video. Those innovations came later as people found new uses for the system.
THE INTERNET & THE INFORMATION AGE
As the internet grew, the American economy was changing. By the late 1900s, many people described a new era called the Information Age. In earlier periods like the Industrial Revolution, economic power came from factories, railroads, oil, and machines. In the Information Age, power increasingly comes from data, communication, and knowledge.
The internet fits perfectly into this shift. If information is power, then speed and connectivity become valuable resources. The internet, then, is the tool that gives people the ability to move information instantly across the world.
This does not mean factories have stopped mattering. Americans still need food, housing, and the things made in factories. But globalization has reshaped the economy. As you learned in the last unit, companies moved manufacturing overseas in the 1970s and 1980s, where labor was cheaper. Meanwhile, jobs in technology, finance, education, and media expanded. So in the Information Age, the fastest-growing industries focused on creating, organizing, and selling information. The jobs of the Information Age belong to the designers and planners of products, not the manufacturers.
Education also became more important. In the early 1900s, high school became essential for industrial jobs. After World War II, the GI Bill helped make college normal for regular people. Today, as jobs rely more on technology and specialized skills, training beyond high school is often necessary to work in the information sector.
HOME COMPUTERS & THE EARLY INTERNET
In the 1960s and 1970s, computers were massive machines used by universities, the military, and large corporations. They could fill up entire rooms and most Americans never used one directly. That changed in the late 1970s and 1980s when personal computers became smaller and more affordable. Companies like Apple, IBM, and Commodore began selling computers for homes and small businesses.
A major turning point came in 1984 when Steve Jobs, CEO of Apple, introduced the Macintosh. Unlike earlier computers that required typed commands, the Macintosh used a graphical user interface (GUI) with windows, icons, and a mouse. This design made computers easier and less intimidating for everyday users.
Many Americans first encountered computers at work or school. Offices used them for payroll and documents. Schools created computer labs where students learned typing and basic programming. This exposure made computers feel normal and useful.
Primary Source: Advertisement
An ad for the first Macintosh emphasizing its user-friendly features, mouse, and graphical user interface featuring windows and icons.
At first, home computers were not connected to the internet. That changed in the 1990s with the creation of the World Wide Web (www). The internet is the infrastructure connecting computers, but the Web is a system of linked pages that made the internet easy to use. We don’t actually see the internet, but we do see the Web.
The Web introduced the idea of hyperlinks: a user can simply click a link to jump to another webpage, which made the Web easy to navigate. Websites also created the need for web browsers, the programs people use to view and explore pages. Early browsers like Mosaic and Netscape made “surfing the web” popular. Later, Chrome, Firefox, and Safari became the most used web browsers.
Primary Source: Movie Poster
The movie You’ve Got Mail captured the spirit of the early internet when people were excited about email and the possibilities the internet promised.
The Domain Name System (DNS) also made the internet easier to use. Websites are identified by numerical addresses, like a street number or an apartment number, but these are hard for people to remember. Instead of typing long number addresses, users type simple names (like inquiryhistory.com), and DNS converts those names into the numbers computers need to find and display each website.
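A tiny Python sketch can show the DNS idea. Every name and numeric address below is invented for illustration; the real DNS is a worldwide network of servers, not a single table.

```python
# A tiny sketch of the DNS idea: people type names, computers need
# numeric addresses. Every address below is made up for illustration;
# real DNS involves a worldwide network of name servers.

dns_table = {
    "inquiryhistory.com": "93.184.216.34",  # invented address
    "example.org": "203.0.113.7",           # invented address
}

def look_up(name):
    """Translate a website name into a numeric address, as DNS does."""
    return dns_table.get(name, "name not found")

print(look_up("inquiryhistory.com"))  # prints the numeric address
```

The lookup works like a phone book: the user remembers the easy name, and the system supplies the hard-to-remember number.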
In the 1990s, many families connected to the internet using dial-up through phone lines. Services like America Online (AOL) helped beginners get online by providing a connection and simple software. Email became one of the first popular uses. Even though dial-up was slow and tied up the phone line, it felt revolutionary.
The excitement of the era led to the Dot-Com Bubble. Investors poured money into internet start-ups, believing everything would soon move online and every new internet company would soon be huge. Many companies had little profit but high stock values. When many failed in 2000–2001, the stock market fell sharply in the dot-com crash. The crash showed how much people were excited about the internet, but also that the new digital economy had risks just like the regular economy.
One company that survived was Google. In 1998, Stanford students Larry Page and Sergey Brin created a better search engine that ranked websites based on links and relevance. Google quickly became the most trusted way to find information. Soon, “to google” became a common verb for searching the internet.
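The core insight behind link-based ranking can be sketched in a few lines of Python. Google's actual PageRank algorithm is far more sophisticated (it also weighs how important each linking page is); this made-up example simply counts how many pages link to each site.

```python
# A simplified sketch of ranking pages by links. The made-up table below
# records which pages each page links to; a page with more incoming links
# ranks higher. Google's real PageRank algorithm is much more complex.

links = {
    "pageA": ["pageB", "pageC"],
    "pageB": ["pageC"],
    "pageC": ["pageA"],
    "pageD": ["pageC"],
}

# Count incoming links for each page.
incoming = {}
for page, targets in links.items():
    for target in targets:
        incoming[target] = incoming.get(target, 0) + 1

# Sort pages so the most-linked-to page comes first.
ranking = sorted(incoming, key=incoming.get, reverse=True)
print(ranking)  # pageC ranks first: three other pages link to it
```

The intuition is that a link acts like a vote: pages that many other pages point to are probably more useful.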
The next major shift was broadband. Unlike dial-up, broadband was fast and always on. No longer did people use the internet in short sessions of a few minutes to an hour. With broadband connections, the internet became a constant part of daily life. Broadband opened the door to online shopping, streaming, and online gaming. Later, smartphones made the internet even more central to everyday life.
Primary Source: Screenshot
The Amazon website as it appeared in 2000 after the company had already begun branching out and selling more than just books.
THE GROWTH OF E-BUSINESS
As broadband spread and the economy focused more on data and information, the internet created new ways to buy and sell goods and services. Online shopping, known as e-commerce, grew quickly. Amazon, founded in 1994 as an online bookstore, expanded into a massive marketplace that sells almost anything. Fast shipping and easy price comparisons changed how Americans shop.
As e-commerce expanded, many traditional stores struggled. For decades, Americans shopped at department stores like Sears, JCPenney, K-Mart, and Macy’s, often spending weekends at shopping malls. In the 2000s and 2010s, many of these stores lost customers as online shopping became faster and cheaper. Shopping malls across the country closed. The stores that survived usually added online ordering to compete.
The internet also transformed business-to-business (B2B) sales. Instead of visiting supply stores, companies now order equipment and materials online in bulk. This shift has made purchasing faster and more automated.
At the same time, the internet lowered barriers for small entrepreneurs. People can sell used goods on eBay. Artists and small businesses can sell products online without opening physical stores. Small, temporary stores like food trucks and craft vendors at weekend markets can accept credit cards. Digital payment tools like PayPal and Venmo made sending money easy. Writing paper checks has become far less common.
Government services have also moved online. Citizens can renew licenses, file taxes, complete forms, and access public records through websites.
Primary Source: Photograph
Apps like Uber have created new opportunities for workers who want to participate in the gig economy.
Another major change is the gig economy. Thousands of Americans now do short-term work arranged through apps. Companies like Uber, Lyft, and DoorDash offer flexible work schedules. For some people, gig work provides extra income. For others, it replaces traditional jobs. However, critics argue that gig workers often lack benefits and job security. Some states have passed laws to regulate these companies, showing that this new economy raises important legal questions.
ENTERTAINMENT GOES DIGITAL
Just as movies replaced vaudeville in the early 1900s, the Information Age reshaped entertainment in the 2000s.
Video games began in arcades, where players inserted coins to play short games. In the late 1970s and 1980s, home consoles like Atari brought gaming into living rooms. Later, companies like Nintendo and Sega made video games a major part of youth culture.
As computers improved and internet speeds increased, online gaming became common. Players could compete or cooperate with others around the world. Handheld devices and smartphone games expanded gaming even further.
Primary Source: Photograph
Handheld games like this Leapster have replaced arcades and opened up a whole new category of entertainment.
Movies and television also changed dramatically. In the past, families watched shows on TV at scheduled times or rented VHS tapes and DVDs from stores. The rise of streaming allows viewers to watch instantly over the internet. Netflix is the clearest example of this shift. It began as a DVD-by-mail company but later focused on streaming. As streaming grew, rental chains like Blockbuster declined and eventually went bankrupt.
The internet also changed who could create entertainment. YouTube allowed individuals to upload videos and build audiences without a television studio.
CELL PHONES TO SMARTPHONES
The invention of the cell phone added another layer to the digital revolution.
Although the idea of mobile communication had been around for many years, modern cell networks were first built in the late 1970s and early 1980s. Early cell phones were large and expensive, often used in cars because their batteries were so big. Over time, phones became smaller and more affordable. By the early 2000s, cell phone stores were everywhere and owning a cell phone was common in the United States.
The biggest transformation came in 2007 when Apple released the iPhone, the first widely successful touchscreen smartphone. Soon after, Google and Samsung introduced similar devices. In 2008, Apple launched the App Store. By the early 2010s, 4G networks made smartphones fast enough for video streaming, constant messaging, and social media.
Smartphones turned the internet into something people carried everywhere. Instead of using a few programs on a desktop computer, Americans began relying on dozens or even hundreds of apps. People use apps to order food, track fitness, get directions, manage money, and communicate.
Smartphones also changed social norms. Texting replaced many phone calls. Emojis added new ways to express emotion in short messages. Friends look up information during conversations. GPS replaced paper maps. Many workplaces began expecting employees to respond to emails outside work hours.
At the same time, smartphones raised concerns. Constant notifications can weaken attention and interrupt sleep. Some people worry that life has become more focused on posting and documenting experiences rather than living them. These debates continue as society adjusts to life with a constant digital connection.
Primary Source: Photograph
Apple’s CEO Steve Jobs introducing the first iPhone on stage in 2007.
WEB 2.0 & SOCIAL MEDIA
The early internet was like a library: most people only read content created by a few authors. Then came Web 2.0. This change was not one single moment, but a shift in the way the internet worked. With the Web 2.0 shift, websites added features so that people could post their own photos, videos, and ideas online, making the internet more interactive and social.
Different platforms made this possible over time. Chat rooms, forums, and comment sections appeared first. Then social media sites like MySpace (2003) and Facebook (2004) allowed users to create profiles and share updates. Later, YouTube (2005), Twitter/X (2006), Instagram (2010), Snapchat (2011), and TikTok (2017 in the US) added new ways to share content, especially video.
Smartphones were key to this shift. Once phones had cameras and high-speed connections, people could upload photos and videos instantly. Social media became a constant part of daily life, not something done only on a home computer.
Primary Source: Screenshot
A sample of the social media options available in 2000.
Social media helped people form communities and spread ideas quickly. It influenced politics and social movements. Internationally, the Arab Spring in 2011 showed how activists could use social media to organize protests. In America, movements like #BlackLivesMatter used social media to share videos, organize events, and gain media attention.
A new kind of celebrity also emerged: influencers, who build audiences online and earn money through ads and sponsorships. They differ from traditional celebrities because they often directly interact with their fans and create content without studios.
Smartphones and social media also created a new phase in the internet’s history: the platform era. In the beginning, content was written directly into web pages. In the platform era, information is saved in databases, and apps and websites control how we see it. So a user can view the same email on a smartphone or a computer, or look up bank account information on a website or an iPad. The information is the same, but the presentation changes based on the platform.
DANGERS & CHALLENGES OF THE INFORMATION AGE
The internet brought many benefits but also new risks. Each wave of development, from early web pages to broadband, smartphones, social media, and AI, has made life faster and more connected, but it has also amplified problems.
These risks matter because the internet is not some magical “cloud” in the sky. It depends on real infrastructure (computers, buildings, wires, etc.) and on decisions about who owns and controls it. Most global internet traffic passes through undersea fiber-optic cables and depends on countless data centers that store and deliver the videos, apps, and websites we use every day. When these systems are controlled by only a handful of companies, questions about competition, security, and political power cannot be ignored.
New technology has always introduced unexpected risks. In 1999, the world faced the Y2K scare. Early computer programs often stored years with only two digits, so “1999” became “99.” When the year 2000 arrived, computers could misinterpret “00” as 1900, potentially causing failures in banking, transportation, and government systems. A disaster was avoided because governments and companies spent billions of dollars fixing their software, but the scare highlighted how much modern life depended on computers by 2000.
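The two-digit year bug is easy to demonstrate with a short Python sketch. The birth years and function name below are made up for illustration; real Y2K fixes involved auditing millions of lines of old code.

```python
# A sketch of the Y2K problem: programs that stored years as only two
# digits computed dates incorrectly once "00" arrived. The numbers here
# are invented to show the bug, not taken from any real system.

def age_with_two_digit_years(birth_yy, current_yy):
    """Compute an age the buggy way: by subtracting two-digit years."""
    return current_yy - birth_yy

# Someone born in 1960, calculated in 1999 ("99"): works fine.
print(age_with_two_digit_years(60, 99))  # 39

# The same calculation in 2000 ("00") gives a nonsense negative age.
print(age_with_two_digit_years(60, 0))   # -60
```

A negative age could crash a program or corrupt records, which is why banks and governments raced to store years with all four digits before January 1, 2000.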
Other threats proved more serious. Hacking, for example, can compromise sensitive data or seize control of systems. In 2017, the Equifax breach exposed personal information of millions of Americans, demonstrating how identity theft becomes easier when large companies fail to protect data.
Ransomware attacks pose another danger, locking files and shutting down key services. The 2021 Colonial Pipeline attack disrupted a major oil pipeline, showing how cybercrime can have national consequences. While these attacks make headlines, ordinary phishing scams trick people every day into giving up their bank account information or other private data.
Primary Source: Screenshot
A screenshot of the WannaCry ransomware attack that spread through Microsoft Windows computers in 2017. The malicious program encrypted data until users paid a ransom in bitcoin.
Cyberwar is another concern. Nations can use hacking to interfere with rivals, targeting infrastructure, military systems, or elections. For instance, Russia has been linked to attacks on Ukraine’s electrical grid, showing that digital conflict is not limited to isolated hacks but can affect essential services.
At the same time, the digital divide highlights inequality in the Information Age. Not all Americans have access to high-speed internet, modern devices, or strong digital skills, leaving some communities behind. Advocates argue that internet access should be considered a necessity, much like electricity during the 1930s and 1940s, when the Rural Electrification Administration worked to bring power to people outside of cities. Today, those without internet access are disadvantaged in education, work, and civic participation.
Privacy is another pressing challenge. Companies and governments collect huge amounts of data about people’s behavior, purchases, and locations. Some data collection is visible, such as targeted advertising, but much happens without us knowing. Cross-site tracking and data brokers can compile long-term records about people’s habits, creating a permanent digital footprint. A notable example was the Cambridge Analytica scandal, in which information from millions of Facebook users was used for political targeting without permission.
Government surveillance has also raised concerns. The USA PATRIOT Act, passed after September 11, expanded the government’s power to collect information in order to catch potential terrorists. In 2013, a government contractor named Edward Snowden illegally released classified information about the government’s spying, showing that ordinary citizens were being watched even if they were not suspected of a crime. Americans were shocked, and Snowden’s actions restarted a debate about how much privacy people were willing to give up for safety.
Debates over net neutrality illustrate how control and fairness remain contested online. Net neutrality is the principle that internet service providers should treat all sites equally, rather than giving priority to those that can pay for faster delivery. Supporters argue this protects competition and ensures smaller websites can reach audiences. Opponents warn that strict regulation could discourage investment in broadband. The policy has shifted over time: the FCC enacted net neutrality rules in 2015, only for them to be repealed in 2017, leaving the issue unresolved and raising questions about fairness and access.
Many of the internet’s challenges are linked to money and the attention economy. Online platforms rely on advertising, often sold through automated auctions that reward detailed tracking and personalization. Algorithms on social media, featuring infinite scrolls and constant notifications, are designed to keep users engaged, sometimes by promoting extreme or emotional content.
This has led to a variety of problems. Conspiracy theories spread because they grab attention, so people click on them, and one result is less trust in government leaders. The ways Americans get their news have also changed. Instead of reading a few local newspapers or watching the local TV news, people get information through many different feeds, influencers, and algorithm-driven platforms. Local journalism has suffered, while misinformation can spread more easily, sometimes with real-world consequences. For example, in 2016, Russian agents spread disinformation on social media about the American presidential candidates. During the COVID-19 pandemic, false claims about vaccines and treatments influenced public health decisions.
Primary Source: Screenshot
In 2026, New Jersey became one of many states to pass laws trying to restrict students’ use of cell phones at school.
Social media’s influence also raises questions about mental health. Teenagers and adults can both experience anxiety, fear of missing out, or distorted views of reality by comparing their lives to what they see on social media. Debates over smartphone use in schools have grown. Some argue phones distract students, increase cheating, and reduce focus, while others point out that devices can support learning, translation, safety, and communication. Enforcing rules around phones can be challenging.
Finally, artificial intelligence (AI) presents a new frontier of both opportunity and risk. Tools like ChatGPT, released in 2022, can assist with writing, translation, tutoring, and work. But AI can also produce realistic fake images, videos, and audio, making it harder to tell the difference between fact and fiction. This challenges not only personal privacy and consent but also public trust. If society cannot agree on what is real, how will voters be able to make wise decisions? Our democracy itself may be affected.
Taken together, these challenges show a recurring pattern: as the internet has become more central to daily life, it has offered unprecedented opportunities while at the same time creating risks older generations never faced.
THE WORLD GOES ONLINE DURING COVID
The COVID-19 pandemic made the promises and perils of the Information Age unmistakably clear. In 2020, Americans of all ages were pushed deeper into the digital world. Lockdowns forced work, school, healthcare, and even religious services online. Students attended virtual classes, submitted assignments digitally, and joined video meetings for the first time. Many older adults who had only used phones for voice calls suddenly had to choose between learning to use technology or being totally alone.
This digital shift brought opportunities but also exposed problems. Some students lacked reliable devices or internet, while many faced challenges with motivation, social isolation, and mental health. In many communities, access to the internet went from being a convenience to a requirement for daily life, making pre-existing inequalities more obvious. Schools had to figure out how to make sure all students had the technology they needed to participate in school from home.
Primary Source: Screenshot
The “I am not a Cat” viral video of a lawyer who couldn’t figure out how to turn off his cat filter on Zoom during an online court session was a humorous example of people adjusting to working entirely online during the pandemic.
Platforms like Zoom and FaceTime became essential for school, work, and family communication. Remote work became widespread, and hybrid work arrangements in which people spend some days in an office and some days working from home remain common today. Telehealth expanded rapidly, allowing patients to consult doctors online, including for mental health services. The pandemic highlighted the power of digital tools but also the risks and gaps inherent in relying on them.
CONCLUSION
The internet has reshaped the economy, entertainment, and communication. It has opened doors to learning, creativity, and connection, but it has also introduced new problems: privacy concerns, misinformation, cybercrime, mental health pressures, and inequality. Like the railroad in the 1800s or television in the 1950s, the internet has fundamentally altered American life. Yet unlike those earlier technologies, it allows individuals to influence millions instantly.
As historians look back over the past 30 years, they may argue about whether smartphones and the internet brought more benefits than costs. What do you think? Has the internet made America a better place?

SUMMARY
Big Idea: The internet has transformed American life by making information, communication, and commerce faster and cheaper, but it has also created new challenges around privacy, inequality, misinformation, and power.
The internet began as a government-funded Cold War project and coincided with the dawn of the Information Age, with the economy shifting away from factories and toward data, communication, and knowledge. Personal computers and the World Wide Web brought digital technology into homes, schools, and offices.
The internet revolutionized the economy and entertainment. It changed how Americans buy and sell, as online sales pushed many brick-and-mortar retailers out of business. At the same time, the internet lowered barriers for small entrepreneurs using sites like eBay and online payment tools. It also made gaming, streaming, and on-demand entertainment normal, as power shifted away from traditional TV and video rental stores and toward online platforms and independent creators.
Smartphones put the internet in people’s pockets. Americans began relying on apps for daily life, changing social norms and raising new concerns about attention, sleep, and in-person relationships.
The internet shifted from mostly reading pages to creating and sharing content. Social media helped people build communities and organize, but it also concentrated influence in platforms and feeds that shape what people see.
As devices and networks became more capable, online life became more integrated into everyday life. The COVID-19 pandemic accelerated the integration of digital tools even further and showed both the benefits of the internet and the costs of unequal access, as the digital divide became a major obstacle for many families and communities. Cybercrime, privacy concerns, government surveillance, net neutrality debates, and misinformation all raise questions about who controls the internet and how society can protect trust and democracy.

VOCABULARY
PEOPLE AND GROUPS
Steve Jobs: Apple leader who helped launch the iPhone in 2007, accelerating the shift to smartphones as everyday internet devices.
Larry Page & Sergey Brin: Co-founded Google (1998), which popularized a more effective way to search the growing web.
Influencer: An online personality who builds a large audience on social media and earns money through advertising, sponsorships, or product promotion.
KEY IDEAS
Hacking: Breaking into computer systems, often to steal information, disrupt services, or take control.
Identity Theft: Stealing personal information to commit fraud, such as opening accounts in someone else’s name.
Ransomware: Malware that locks files or systems until a ransom is paid.
Phishing: A scam that tricks people into revealing passwords or personal information, often through fake emails or messages.
Cyberwar: Conflict between countries that involves hacking to disrupt infrastructure or weaken rivals.
Digital Divide: The gap between people who have reliable internet access, devices, and digital skills and those who do not.
Cross-Site Tracking: A method that allows companies to record user activity across websites and apps and compile profiles of what users buy, read, watch, etc.
Government Surveillance: Government monitoring or collection of people’s communications or data, often justified for security.
Net Neutrality: The principle that internet service providers (companies that provide internet access) should treat all online traffic equally and not favor certain websites or services.
Telehealth: Healthcare services delivered through the internet (such as video appointments), allowing patients to consult with doctors remotely.
EVENTS
Information Age: A modern era, beginning in the late 1900s, in which economic and political power increasingly comes from data, communication, and knowledge.
Dot-Com Bubble: Rapid growth of internet-based companies and investment, followed by major market losses.
Y2K Scare: Late-1990s fear that computers would malfunction in 2000 because many programs stored years using only two digits.
Web 2.0: A phase of the internet when users increasingly created and shared content, not just read it.
Platform Era: A later phase of the internet when information is no longer written out into static webpages but is stored in databases and formatted into apps, feeds, and webpages that people interact with.
COVID-19 Pandemic: Public health crisis that pushed schools, work, healthcare, and services further online.
BUSINESS
Apple: Technology company that popularized the personal computer with the Mac and smartphones with the iPhone.
America Online (AOL): Company that helped many Americans get online in the 1990s through dial-up internet and an easy-to-use online service.
Google: Company that started as a search engine in the late 1990s and then expanded into many other internet products and services.
E-commerce: Online buying and selling (online shopping).
Business-to-Business (B2B): Online sales and services between companies (not directly to individual consumers).
eBay: Online marketplace that helped popularize person-to-person selling.
Gig Economy: Work built around short-term jobs, often found through apps and online platforms like Uber or DoorDash.
Netflix: Streaming platform that helped normalize on-demand TV and movie watching.
YouTube: Video platform where people can post videos and build audiences.
Facebook: Social media platform launched in 2004 that became a dominant network for online communication.
Zoom: Video meeting platform widely used for school, work, and communication.
Data Brokers: Companies that buy and sell personal data, often collected from many sources.
Attention Economy: A system in which digital platforms compete for user attention in order to sell advertising, often using algorithms and design features to keep people engaged as long as possible.
Remote Work: Working from home (or outside a traditional workplace) using the internet for communication, meetings, and job tasks.
Hybrid Work: A work arrangement that combines remote work with in-person work, such as coming into an office on some days but not others.
TECHNOLOGY
Internet: A global “network of networks” that connects computers and allows information to move between them.
ARPANET: An early computer network that connected universities and research centers and helped lay the groundwork for today’s internet. It was funded in part by the government because of Cold War fears.
Personal Computer: Smaller, affordable computer designed for home and small business use. These became common in the late 1970s and 1980s.
Macintosh: Apple’s 1984 personal computer that helped popularize a graphical user interface for everyday users.
Graphical User Interface (GUI): A computer interface that uses windows, icons, and menus instead of typed commands. This design made computers much easier for the average person to use.
World Wide Web (www): A system of web pages and hyperlinks that runs on top of the internet and is accessed through a browser.
Web Browser: Software that lets users view and navigate web pages. Examples include Chrome and Safari.
Domain Name System (DNS): The system that translates website names into the number-based addresses computers use.
Dial-Up: An early form of home internet connection that used telephone lines, which was slow and tied up the phone line while connected.
Broadband: A faster, always-on internet connection that replaced dial-up and made it easier to stream, shop online, and stay connected throughout the day.
Search Engine: A tool that helps users find information online by searching the web for keywords. Google became the dominant option by the early 2000s.
Streaming: Watching or listening to media online without downloading the entire file first.
Cell Phone: Mobile phone that connects through networks of towers (“cells”).
Cell Network: System of cell towers that allow mobile phones to communicate wirelessly.
iPhone: Apple’s smartphone (first released in 2007) that popularized the modern touchscreen smartphone and helped expand the app economy.
Smartphone: Cell phone that functions like a small computer with internet access and apps.
Social Media: Websites and apps where users create profiles, share content, and interact with others. Facebook, Instagram and Snapchat are examples.
Data Centers: Large warehouses filled with servers that store and process online information.
Cloud Computing: Using remote servers (often in data centers) to store, run, and deliver apps and data over the internet.
Artificial Intelligence (AI): Computer systems that can imitate tasks associated with human intelligence, such as generating text, images, or audio.
