The Silicon Valley internet boom, for some of us who were around during the last start-up scramble, looked like déjà vu but with a difference. There were the same youthful techies in the same one-product shops, working the same goldminer hours, with the same visions of innovation and bonanzas. Hair was shorter this time, and the bangles and wild color schemes were on the bodies rather than on the clothes. But the cult of the new was the same.
So were a lot of other things. Java was new, but not object-oriented programming structures; those came in with C++ in the 80s. Relational databases were slurped up practically unchanged; really big data still can't go to ground as Web-based content. Management and process strategies for high-tech development never change. Read the old masters, Gerry Weinberg, Tom DeMarco, Ed Yourdon, Grady Booch, and you'll see today's companies are still rerunning the same messes and successes.
Software test is another technology the 90s didn't make obsolete. Testing techniques that worked for client-server didn't suddenly grow irrelevant when the new economy added an extra tier. However, the dot-comites didn't see this carryover. They wanted their QA to be new too. So, many of their test balloons floated off with only new people at the helm. There were many crashes, and many more preventable bumpy rides.
Let me remind you about new technology in the 80s. That was the beginning of mass-market software and hardware: a whole mini IT shop on a PC for each customer, a whole new world that the lore of the mainframe didn't fit. Our desktops now are measured in megabytes, but then they were measured in kilobytes, and they stood alone until LANs started to connect them. PC developers were largely self-taught, as mainframe developers could never be, though many of the first PC programmers had professional mainframe experience. Test for the PC was greener still, with little influx from mainframe QA groups; our shrink-wrap ways weren't what they were used to. ("Agile development" isn't new either!)
So we reinvented software test for the PC, and reinventing it then made sense. A menu-driven desktop app could be ad hoc tested cheaply and effectively. We made some attempts at automation, but the tools were crude, and we couldn't keep up with changing menus. As it turned out, our apps lived only two to three years before being retired, too short a time for regression suites or documentation to be missed. Habitual test planning didn't happen till the late 80s, when OOP and its multiplex apps pushed us out of the oral tradition.
So when client-server came in to make things even bigger, we were ready; we just added networks to the complex OOP testing we already knew. Connectivity became an issue, but not too quickly. At first the UI and business logic stayed on the (sometimes-layered) desktop, while only the data migrated server-side. Multi-user access and data refresh were new only to the desktop; relational databases had handled both since mainframe days.
That was the state of software test at the start of e-Business technology. The Net and the Web widened the world. Business logic and network functionality were modularized on an array of differently configured servers, leaving little on the desktop besides a browser-rendered UI and the odd applet. Load, connectivity, JVMs, and mean time to failure acquired new importance for test, though maybe not as much for those of us who had tested on WANs with DSL and wireless. But as with client-server, NewTech testing was a manageable next step from what we'd already done. Or so it could've been. Instead there was a disconnect. Internet development was spearheaded by a new generation,