How Far We’ve Come (and How Far We May Go)

Summary:

At the start of a new year, Michele Sliger looks back across the recent decades of information technology advancement—from the dawn of the personal computer to the abundance of social networking websites—and (with some pointers from Ron Jeffries and Linda Rising) ponders how those advances have impacted our view of change, software, and ourselves.

It’s that time of year again, when we look back on the past and reminisce. Having crossed the midpoint of the age scale, I marvel at how far we’ve come in our technology and communications. The thread that connects the years, I find, is faster and more individualized access to information. And, it reminds me that IT—information technology—really is about the speed of access to information and not just how we refer to the computer operations department.

Information Technology from 1977 to 2000
The information access wave started in 1977 with the arrival of mass-market personal computers. The TRS-80 and the Apple II were released that year, followed by the IBM PC in 1981 and the Commodore 64 in 1982. Suddenly everyone had computing power on their desks, rather than having to schlep down to the computer room and schedule time on the mainframe.

With the advent of PCs, ways to connect them within the workplace naturally followed. SMTP was established in 1982, and networks exploded in the business world in 1983. This was also the start of the instant communications wave, as the first commercial handheld cell phone, the Motorola DynaTAC, was released in 1983, followed by POP and cc:Mail in 1984. Email and cell phones were the technologies that made each of us available for communication 24/7.

And, let’s not forget that in 1980 CNN went on the air. It took a few years before all the satellite and cable providers picked it up (and a few years before people decided they wanted to pay to watch TV), but when it took off in the mid-1980s, it made its own contribution to instant information access by being the first channel to provide news coverage twenty-four hours a day. No longer did the masses have to wait for the 6 p.m. and 11 p.m. broadcasts.

By the late 1980s, most of us had access to PCs, networks, email, cell phones, and satellite or cable TV. But, the biggest wave, the Internet, didn’t start to build until the early 1990s. The World Wide Web project was established in 1990, but the Mosaic 1.0 browser didn’t come out until 1993—at which time there were a total of about fifty websites available.

By the late 1990s, most of us owned a PC or laptop, a mobile digital 2G phone with SMS texting, and an Internet-provided email address (aol.com, hotmail.com, yahoo.com), and we had more than one hundred channels on TV. Pop culture absorbed and reflected these life-changing adoptions by giving us electronic music, cyberpunk science fiction novels, and movies that captured first our fears (The Terminator in 1984) and then our loving embrace of technology (You’ve Got Mail in 1998). Our access to information had become pretty darn speedy.

The Turn of the Century
In the past decade, we’ve refined our creations. You can surf the Internet using a half-dozen different browsers. You can do your surfing on your own 4G smartphone, tablet, or lightweight electronic notebook. Google, the company whose name became a verb for looking up information, leads the way in providing near-instant information access. Heck, thanks to the Internet, you don’t even need a TV to watch “TV” anymore.

Today, you probably have more than one email address, more than one phone number, more than one personal content offering (Facebook, Twitter, your own website or blog), and one or more software tools to manage them all. Some people feel overwhelmed at all they have to keep up with, some haven’t even bothered trying, and some have known nothing else.

Linda Rising, coauthor of Fearless Change and well known for her pattern work, says:

Most information “out there” is less than fifteen years old, and in fast-moving scientific fields, information doubles every three years. No one can keep up. The youngsters nowadays may seem to know a lot more than us old farts, but most of them don’t know their history. I have sat through many technical presentations on the latest and greatest and thought to myself, “We had this discussion years ago. It was a different programming language and a different development environment, but the issues were the same. Why haven’t we learned this once and for all?”

While the technology continues to change, the problems we face continue to be the same, decade after decade.

What’s It All Mean to the Software Development Industry?
It’s clear that the pace of information exchange was slower in the twentieth century, save for the technology and communication explosion in the 1990s. Before this instant access to information, software development teams had the luxury of time—time to analyze, design, code, test, and deploy the product before the customers could change their minds. In other words, the application or system would go live before customers could gain access to information that might affect their original vision or needs.

This is one reason why agile has become so popular. In an era in which things change so quickly that we struggle to keep up, agile provides us with frameworks that allow us to flow with the changes, deliver product increments quickly, and become as fast moving as the environment around us.

Ron Jeffries, coauthor of Extreme Programming Installed and one of the founders of XP, sees another reason for agile’s growth: 

Over time, I learned that most of the interesting problems in our work are people problems, not just technical problems, and I became more and more interested in the bigger picture of how to get things done. That made me an easy mark for “agile,” which focuses strongly on the individual and the team, and for Extreme Programming, with its great emphasis on technical skill.

In addition to agile’s framework for handling change and its focus on individuals and teams, there is a third driver behind agile’s growth: honesty. As Linda Rising points out:

I’m not sure up-front planning ever worked. I just think it was easier in the olden days to pretend that it worked, so we all held hands and danced together. We all knew that it was a lie, but it felt good to sign those contracts and believe that the intended functionality would appear by the specified date.

Nowadays, when customers don’t know what they want, it’s harder to do that dance. Many times, customers are completely honest: “I don’t know what I want, but I want it by June!” And we, of course, say, “We can do that!” What’s to plan? When no one knows—not the customer, not the marketplace, not the researchers, no one—then our fallback position has to be that we grow it together. That’s a dance that can take place with integrity. Instead of pretense, we have honesty. I believe that’s what agile allows us to do: admit that we can’t plan it out—that we never could—and move forward together.

The Future
If we look to the past and use that information to try to predict the future, it can look a bit daunting: great strides in technology, yet the same issues we’ve always faced (and at a faster pace).

Personally, I’m hoping I get to see the quantum computing and nanotechnology wave start to break in the next ten to twenty years, leaving the youth of today to begin tomorrow’s IT history article with “Do you remember when computers were made with integrated circuits?” And, with continued persistence from the agile community, I also hope to see “Do you remember when software developers built systems without involving the customer?”

What do you think? Please share your thoughts on how far we’ve come and where we might end up.
