We were wrong. A lot.
I reached that simple conclusion after reading hundreds of articles and dozens of back issues of PC Magazine. We rarely spotted the trends before they happened, and we were sometimes openly hostile to new and innovative ideas. Fortunately, we were never wrong for dull reasons. And from 1982 to this moment, PCMag has witnessed and chronicled so many rises and falls that our oeuvre resembles a cultural seismogram.
Within those peaks and valleys, over the decades, a few products appear again and again. They show the course of our industry as well as of our publication. This is 40 years of PCMag in six shortish stories about key subjects we returned to often: the Apple iMac, the game Myst, the IBM (later Lenovo) ThinkPad, the Apple iPhone, Google’s Android OS, and the singular, transformational technology behind everything: the internet.
(My colleague Sascha Segan feels we were right more often than we were wrong, but also found five big things we were wrong about, so be sure to check out his viewpoint as well.)
The iMac: Steve Jobs Reimagines the Desktop PC
The first use of the term “iMac” in PC Magazine was in our June 30, 1992 issue: It was a low-cost ISDN adapter called PC IMAC, and it was orders of magnitude less interesting than the iconic, colorful, all-in-one PC that Apple announced in the summer of 1998.
Our first evaluation of the Apple iMac ran in the Oct. 20, 1998 issue. Writer Tom Pope praised its simplicity but echoed most of the criticisms of contemporary commentators. Pope called the lack of a floppy disk drive “loopy” and wasn’t a fan of the notorious hockey puck mouse. Presaging many future complaints about Apple, Pope noted that upgrading the iMac was possible but limited and difficult. He also took issue with Apple’s decision to go all-in with USB. While it’s now ubiquitous, this was USB’s first appearance on a Mac, and few peripherals were available.
“We found the iMac visually appealing and enjoyable to use. Apple’s claims of Pentium-killing power are clearly overstated, but the iMac still succeeds extremely well at being what it was intended to be: a fast, easy-to-use machine that will remain viable long after the warranty runs out.”
Most of Pope’s review was devoted to debunking Apple’s claim that the iMac was “three times faster” than comparable products, which was not borne out in our testing at the time. This would ignite something of a firestorm, resulting in PC Magazine retesting after 300 angry reader replies and another evaluation of the iMac in the Jan. 5, 1999 issue. Perhaps sensitive to reader response, writers Pope and Nick Stam went out of their way to say how much they liked it when they first reviewed it.
We even nominated it for a 1998 Technical Excellence award, which it lost to the Toshiba 800 laptop.
“One thing’s for sure: the Apple iMac is rarely met with indifference. … You have to admit that the latest successor to the famed and fabled Apple Macintosh elicits strong opinions. It is unlike anything that came before it—and definitely moves things in a new direction.”
What’s most surprising about our earliest reviews of the iMac is how little attention we gave to the look of the thing. Built from luminous white and bright-blue plastic and resembling the nose of a space shuttle, the iMac looked like nothing else. Arguments over aesthetics were left to the opinion pages, and much ink was spilled.
In the July 1998 issue, Editor Jake Kirchner said contemporary PCs had “all the fashion sense and design pizzazz of a hacksaw.” Yet Kirchner also dismissed the iMac as overpriced and underpowered: “It’s all about price and specs,” wrote Kirchner. “High-tech is hip enough to sell on its technical merits alone.”
December of 1998 saw two op-ed columns that included digs at the iMac. Senior Executive Editor Bill Howard called the iMac “cute” but warned Christmas shoppers: “By the way, you want a PC, not a Mac.” Contributing Editor Jim Seymour went further, boldly declaring, “If Apple’s cute iMac has become, to twist an earlier Macintosh line, ‘the computer for those of us who are embarrassed about having a computer in our homes,’ the SE7/T55A combination is ‘the computer for those of us who are proud to have a computer in our home.'”
Seymour would go on something of a journey in his relationship with the iMac. In February 1999, he said of the iMac, “I’m not as fond of it as some, but I’ll concede that its designers broke the stranglehold of the Beige Box Rule in PC design.” By March of 2000, Seymour still decried the original iMac as “wildly overrated” but was willing to concede that the current iMacs were “respectable machines.”
In that same column, Seymour rightly identified the iMac as a “pivotal step in saving Apple,” which is an important part of the machine’s legacy. PC Magazine issues from 1999 to 2000 do capture the iMac’s short-term impact: an array of cheap, colorful imitators cluttered the pages. In our Oct. 5, 1999, issue, we praised the iMac cloner eMachines for beating the original on features and price.
The next incarnation of the iMac would be even more dramatic than the original: a flat screen floating on an adjustable silver neck attached to a white, domed base. This time, PC Magazine was more willing to accept the design: “Every few years, Apple debuts a computer that gets people talking in a way no Windows PC does,” wrote Editor-in-Chief Michael Miller in March 2002. “This year, it’s the new flat-panel iMac. I recently spent some time with a preproduction version and can report it’s great fun to use.”
Our reviewers were more critical of the “gooseneck” iMac, pointing out the awkward keyboard, lack of graphics upgrade options, and poor speakers. (“If you enjoy music, you won’t want to hear it on the iMac’s internal speaker.”) But this iMac earned a rating of four out of five stars in our April 9, 2002, review, and in December of 2002, we gave the 17-inch version of the iMac a perfect five stars. Notably, that issue also featured an iMac attack ad from Gateway.
“If the crowd in our labs is any indication, Apple has a success on its hands with the new Apple iMac. Even jaded veterans of the platform wars stopped by to admire the machine’s innovative design…”
If the reviewers of 1998 were mum about the iMac’s looks, PC Magazine in 2002 pushed its prose to questionable heights. One reviewer called the machine a “funky flower pot,” and another repeatedly waxed poetic about its “alabaster dome.” PC Magazine declared this in the Dec. 24, 2002, Technical Excellence award issue: “The iMac has one of the most amazing necks since the days of Audrey Hepburn.” The iMac lost that year to the NEC PowerMate Eco.
Imitation is the highest form of flattery, and so are attack ads.
“The latest incarnation of Apple’s design standard-bearer, the Apple iMac G5, is no less sublime, but a lot more subtle. The iMac G5 will have you nodding your head and saying, ‘It’s about time they designed a computer like this.'”
In 2004, with a perfect five-star rating from PCMag, the iMac became upright, transforming into the form factor it still has today. There were other changes, of course, including moving to Intel chips and ditching plastic for aluminum. For 15 years, the iMac would get faster, bigger, thinner, and briefly go Pro—but it was still a gray aluminum rectangle.
That changed in 2021, when the iMac returned to its colorful roots. We still disliked the mouse and the choice of ports, and we noted its lack of power compared with other Macs. Yet it’s telling that after so many years of the iMac being a runner-up, this is the model that won a PCMag Technical Excellence award.
“The 24-inch iMac is a welcome tribute to the iMac’s origins. Color options are just as important now as they were in the late 1990s.”
Hindsight makes it easy to poke fun at PCMag’s struggle to make sense of the original Apple iMac, but its success was not a foregone conclusion. And in some ways, those reviewers were right: The iMac was underpowered and certainly didn’t live up to Apple’s marketing. And, yes, the mouse was terrible.
What PCMag missed with the iMac wasn’t a fundamental shift in design (the era of iMac knockoffs was short-lived) but a shift in what drove people to buy computers. Ease of use and attractive, eye-catching looks appealed to people who wanted to explore the nascent internet and benefit from what PCs had to offer. These weren’t ignorant customers being duped by slick marketing; they simply had different priorities, ones that PCMag didn’t see. And over the next two decades, these customers would define who consumer electronics were for.
Myst: Cyan Beguiles and Puzzles…Everyone
With digital downloads and streaming services competing for our attention, it’s easy to forget that there was a time when CD-ROMs were a radical new technology. They promised an end (mostly) to swapping through a half dozen diskettes, as well as a whole new world of multimedia experiences for the PCs now equipped with CD-ROM drives. Right at the beginning was Myst, a foundational work for gaming and an early demonstrator of what CD-ROMs and PCs could deliver.
Myst originally debuted for the Mac in 1993, but PCMag did not review it until its 1994 release on Windows. On the bottom of page 481 of the June 14, 1994 issue, next to ads for 90MHz Dell PCs, was our four-star review.
“Your new world is a series of bewildering enigmas that when added together become Myst.”
A puzzle game in which you traverse fantastical worlds through magical books, Myst was, nonetheless, literally static. You moved in-game by navigating among pre-rendered CG images. While simple, the game embraced a contemplative style.
“You can’t get killed, nor can you do anything that will automatically lose the game,” wrote Barry Brenesal. “And it’s just as well, because you may want to neglect the plot for a while and just admire Myst’s striking visuals.” And while Brenesal noted that the static nature of the game was frustrating, he praised the atmosphere created by the game’s “visual and aural spell.”
Myst’s impact went beyond entertainment. “Occasionally, there’s an interface that’s so seamlessly intertwined with the program that it’s beautiful,” wrote Editor Robin Raskin in 1994. “Take a look at Broderbund’s Myst for the ultimate example.”
The game made our list of the 100 Best CD-ROMs in September 1994 and was a finalist for a Technical Excellence award that year. “Myst’s graphics and integrated sound have made it both popular and much-imitated,” we wrote, but we still gave the award to another foundational if wildly different game: Doom.
In 1995, a PCMag writer described Myst as “Shakespearian,” and our 1997 15th-anniversary issue deemed the game a modern classic. “The graphic adventure that shook the world,” wrote Associate Editor Michael Ryan. “Myst challenged the hard-core game players while helping to open the arcane world of computer gaming to the rest of us.”
In addition to acknowledging Myst as a cultural touchstone, we also used it as a graphical touchstone. Starting in 1995, Myst was one of the games we installed on PCs as part of our testing regimen.
Myst went on to spawn many sequels, and we reviewed them all. Riven: The Sequel to Myst, a masterpiece in its own right, earned four stars in 1998; Myst III: Exile, which features a homicidal Brad Dourif, earned five stars in 2001; Myst IV: Revelation took 4.5 stars in 2005, and reviewer Tony Hoffman gave special attention to the Peter Gabriel soundtrack; also in 2005, we gave the final installment, Myst V: End of Ages, four stars. The unusual and controversial MMO (massively multiplayer online) game Uru: Ages Beyond Myst received a shoutout in 2003, but we declined to give it a rating.
In later years, Myst faded from public discourse, but it never went away. Partly driven by COVID-19 lockdown-induced boredom, I, your humble reporter, played through Myst on the Nintendo Switch during that first pandemic summer. I was so struck by the game that I not only reviewed it (a bit harshly because of some porting issues), but I also played all of its sequels in one marathon season. This was fortuitous, because Myst’s developer, Cyan, was preparing an entirely new dimension of Myst.
In late 2020, a revamped Myst debuted on the Oculus Quest VR headset and was followed shortly after by a high-definition version of Myst for PC with 2D and VR support. I sometimes stopped just to watch the water flow by, or to take in the heavy atmosphere of the Channelwood swamps, much like our 1994 reviewers did.
In an example of the cyclical nature of time, the 2020 Myst was used for PCMag testing—this time, to evaluate supersampling for different graphics cards.
I marveled at the sparkling blue cave within the fore-chamber and felt compelled to bend over to inspect a particularly interesting rock. I actually gasped at the whorls of the wood grain in the library. The quiet, windswept melancholy of the Selenitic Age made me a bit giddy, particularly as I descended an especially long ladder into a cave. Myst has always rewarded exploration, and in such a lushly detailed world even longtime fans will find plenty to investigate.
PCMag has always covered PC games (there’s a full-page ad for Zork in one of our earliest issues), but Myst was different. This was a game of such wild popularity that its sales competed with antivirus products and even the Microsoft Windows 95 upgrade. It wasn’t the first PC game, but it was certainly one of the few that transcended its genre, rising to the realm of talked-about entertainment. PCs, Myst seemed to argue, could transport us as well as any magical book.
The ThinkPad: IBM Redefines the Laptop
When I started researching this story, there were two products I immediately knew I had to include: one was Myst, and the other was the ThinkPad. We often think of Apple as having iconic design, but the iMacs of 2022 bear little resemblance to their 1998 ancestors. The ThinkPad, on the other hand, has changed dramatically but still looks like its earliest progenitors. Perennially associated with business, the ThinkPad nonetheless has a well-earned reputation for quality that PCMag has tracked for decades.
It did not start out that way. “To those familiar with IBM’s checkered past in the portable market, this notebook is a bold step and a great success,” wrote Mathew J. Ross in PCMag’s Dec. 22, 1992 review of the ThinkPad 700C. “The impressive design work inside the ThinkPad may have you wondering who is actually responsible for this impressive…color notebook.”
One critical problem with early laptops: What do you use to replace a mouse? Trackpads were still a pipe dream, and rollerballs were far from standard issue. IBM’s solution was the TrackPoint II, which Ross called “the most unobtrusive and user-friendly pointing device in the notebook industry.”
The red TrackPoint nubbin still graces ThinkPad keyboards 30 years after its introduction. Functioning a bit like a tiny, flexible joystick, the TrackPoint lets you maneuver the cursor without taking your hands off the keyboard. As Ross pointed out, it’s also comfortable for both right- and left-handed people.
“After years of designing undistinguished portables, IBM has finally gotten it right.”
The same issue of PC Magazine also contained a curious reference to “IBM’s monochrome pen-based ThinkPad.” That’s clearly not the 700C. It turns out the first ThinkPad was an actual pad: a tablet PC that would eventually be branded as the 700T. That original ThinkPad boasted a 10-inch screen with 4MB or 8MB of RAM and started at $5,000 in 1992 dollars. We took a closer look at the machine in June 1992, noting its rugged design and solid-state storage—a whole 20MB of it.
This odd, almost forgotten piece of ThinkPad history foreshadows daring innovations from the aggressively plain business laptop. There’s perhaps no better example than the ThinkPad 701C, a bland name for a laptop that featured a literal moving keyboard. Until gaming laptops made it once again acceptable to market a computer roughly the cost and weight of a Honda Civic, designers had to balance a notebook’s portability against its size. This led to cramped, awkward keyboards for laptops.
The 701C was different, as Senior Editor Brian Nadel wrote in the March 14, 1995 issue:
“When the case is closed, the keyboard occupies the notebook’s entire footprint, but the key’s rows are offset. When the lid is raised, a spiral cam in the left hinge starts the mechanical magic by moving a series of levers and arms that slide the left half of the keyboard over to the left and the right half down and to the right, like two pieces of jigsaw puzzle forming a whole.”
“The engineers at IBM have gracefully packed so much potential into such a small package that the ThinkPad 701C sets the new notebook standard for others to match.”
Despite its solid (if stodgy) reputation, the ThinkPad series had a brush with death shortly after our glowing five-star review of the X40. “The news that IBM is selling its PC business has generated a lot of backward-looking, warm and fuzzy feelings for a company that doesn’t really have such a warm and fuzzy image,” wrote columnist Bill Howard in our Feb. 8, 2005 issue.
“In notebooks, IBM came from nothing,” continued Howard. “Its pre-ThinkPad L40SX of 1992 was so inferior, it was all we could do not to laugh when IBM showed it off in our labs.”
“In recent years, IBM made solid, secure desktop machines,” continued Howard. “But computers didn’t improve the company’s ability to sell services and consulting.”
Howard’s eulogy was premature, as the ThinkPad lived on under Lenovo, but IBM’s exit from the PC business was also the end of an era for PC Magazine. The publication began explicitly as an IBM PC fan magazine, but IBM no longer made computers. PCMag had grown far beyond those roots but continued to cover the ThinkPad under new management.
“Nothing epitomizes classic more than a ThinkPad.”
In 2012, reviewer Joel Santo Domingo praised the ThinkPad X1 Carbon—an early adopter of carbon-fiber materials for a stronger, lighter design. “It effortlessly collects yet another Editors’ Choice award as the most desirable executive notebook on Earth,” wrote contributor Eric Grevstad of the ninth-generation Carbon, praising its “flawless design and engineering.” Even 2022’s “huge and heavy” ThinkPad P17 Gen 2 earned our praise for being “the ultimate mobile workstation.”
The ThinkPad X1 Fold recalls the earliest ThinkPad design.
Though the iMac and iPhone may have surprised PC Magazine, the ThinkPad did not. We immediately recognized quality and forward-looking engineering. It’s to the credit of both the ThinkPad designers and years of PCMag reviewers that neither has been too afraid of bold, imaginative machines, provided they still fit inside a plain, black square. Even when it folds in half, the ThinkPad is still a ThinkPad.
The iPhone: A Phone That’s Not for Calling People
As with the iMac, the first mention of an Apple iPhone predates the phone itself. In our Dec. 26, 2006 issue, reviewer Sascha Segan boldly proclaimed, “We want the iPhone.”
“Of course, rumors of an ‘iPhone’ (an Apple-branded cell phone that would revolutionize the way we use mobile devices) have been swirling around for years now.”
Segan’s desire for an Apple phone was so strong, he converted a Motorola SLVR into a makeshift iPhone. Key to this was hacking the SLVR’s version of iTunes, Apple’s media player, to contain more than 100 songs. Segan even rebranded the device, grinding off the old logo with a sugar cube.
Despite all that buildup, the actual iPhone announcement received almost begrudging coverage from PC Magazine. Editorial Director Jim Louderback noted in our Feb. 20, 2007 issue that the iPhone was the biggest news of CES—despite not being announced at CES.
“Steve Jobs rolled out the iPhone, which struck me initially as Newton 2.0—14 years later.”
Even Segan was skeptical of the original iPhone’s significance. “Apple’s new phone seems to promise an iPod-like revolution, with a ground-breaking interface that turns information into a physical thing you can pinch, grab, and stretch,” wrote Segan in the March 6, 2007 issue. “But I think the iPhone will be more of a Mac: a cult item that will influence, rather than dominate, the industry.”
The debate about the iPhone’s viability reached its peak in the Aug. 7, 2007 issue, which featured dueling op-eds at the front and back of the magazine. First, Louderback argued that initial iPhone sales would be strong but that the success wouldn’t last.
“You’re going to be really cool the first month or so with that iPhone. But after you go to a party and three other people have one, it’ll seem less alluring,” he wrote. “The true trendsetters will move on quickly (to Helio’s Ocean or Nokia’s N95), leaving the iPhone to those with more money than taste.”
Editor Lance Ulanoff had the opposite take, arguing that initial iPhone sales would be poor, in part because few people would be willing to break their wireless contracts to get new hardware, but that its fortunes would change.
“The lack of a keyboard will turn off some customers, especially those who grew up with rotary and push-button phones,” wrote Ulanoff. “It will, on the other hand, attract all the young, trendy, flexible, iPod-loving customers Apple cares about.”
“In very short order, the iPhone will become the ultimate cell-phone status symbol, pushing aside RAZR and Sidekick.”
It wasn’t until Aug. 21, 2007, that PCMag finally ran its review of the iPhone. Heralding it as “the most overhyped product of the decade,” PCMag was nonetheless impressed, giving it four stars. “With its groundbreaking interface, the Apple iPhone is the best portable media player ever made, and it browses the Web like a champ. We’ve never had this much fun testing a handheld,” wrote reviewers Segan and Tim Gideon.
Our reviewers were particularly taken with the iPhone’s then-unique touch interface. “Make no mistake—using your fingers to zoom, skip, crop, and edit is sheer joy,” they wrote. “Pinching and sliding through the menus is just as cool as the commercials make it seem.”
The first iPhone was not a slam dunk, however. For one thing, “call quality was the worst we’ve heard on a high-end device,” wrote Gideon and Segan. Still, it’s telling that PC Magazine devoted two entire paper pages to the device. The Nokia E61i may have taken the Editors’ Choice award, but it received just half a page.
Opening the iPhone to app developers changed everything. Starting with the iPhone 3G, it was clear that the iPhone’s killer app, to use an outdated phrase, wasn’t one app but the constellation of third-party apps that grew for iOS. At the time, the App Store offered just 500 apps but would benefit from a wave of development for the new platform. Then–Executive Editor Dan Costa rightly sensed that iOS devices would be “the future of portable gaming.”
In June 2008, Ulanoff called the iPhone “the most important product of the still-young 21st century.” The device had become an everything-machine. “If [people] can get all of this from something that fits into their pocket, then why have a PC at all?”
“The possibilities are endless—as they should be for whatever succeeds desktop computing.”
Over time, Apple addressed many of PCMag’s initial complaints but also garnered new ones. For example, starting with the iPhone 7, Apple removed the 3.5mm headphone jack.
“Apple says its steps this year are ‘courageous,’ and the company is on the right side of history,” wrote Segan, who conceded that other wired or wireless audio options probably were the direction of the industry. “But that doesn’t mean you have to be a day-one adopter of the courageous new technology.”
iPhones no longer require this degree of DIY effort.
The most recent incarnations of the iPhone have fully embraced the idea that a mobile device is your primary device. When writing about the super-sized iPhone XS Max in 2018, Segan declared it “the best expression of Apple’s smartphone philosophy so far, with a giant, gorgeous screen connecting you to everything.”
Like the iMac, the iPhone took technology that had long been in the hands of businesses and enthusiasts and made it desirable on a larger scale. Unlike the iMac, the iPhone has stayed at the forefront of its market. Of all the futuristic and mundane devices PCMag has seen, this class of hyper-connected handheld computers has had the greatest impact—and it started as the most overhyped device of the decade.
Android: Google Takes Over the World
Compared with the iPhone, Google’s Android mobile operating system had a far humbler debut. Editor-in-Chief Lance Ulanoff recognized in February 2008 that Google would likely continue dabbling outside its core search business and that other manufacturers would churn out new Android phones, “none of them the dreamed-about Google phone. Google will never produce a phone of its own.”
Lead Analyst Sascha Segan made equally prophetic statements in May of 2008. “Though the user interface on the demo phones was mundane, Google has said that manufacturers and mobile carriers will be able to customize the operating system heavily,” wrote Segan. “That could make Android an unusually diverse platform, or just another way for carriers to deliver locked-down services.”
It would be both.
An actual Android phone would not debut until late 2008 and wouldn’t make it into PC Magazine until January 2009, in the form of the T-Mobile G1, manufactured by HTC. Its large screen slid back dramatically to reveal a QWERTY keyboard, but the real oddities were south of the display. The bottom of the phone had a significant angle to it—like a chin—and a roller ball in addition to its touch screen. In a column, Segan called the G1 “basically a less expensive, less totalitarian version of the iPhone.”
In his review (which was truncated in print but persists in full online), Segan didn’t have a lot to say about the physical weirdness of the G1, aside from noting that the trackball was quite useful for selecting tiny links in the mobile web browser and that the device lacked a headphone jack long before Apple’s fit of courage.
The first review of an Android phone would appear in the last print issue of PCMag.
In that debut, modern readers will recognize much of what still defines Android today. Apps are contained in a hidden “drawer,” and the three action buttons (physical buttons on the G1) let you interact with the device. It was also customizable. “Yes, this home screen is completely configurable—you can even throw out the phone dialer if you want,” wrote Segan.
Segan noted that the G1 lacked crucial features—like a video player—but saw a solution in Google’s app store. Despite the fact that the G1 was not yet for sale, Segan wrote that he saw new apps being added daily.
“Keep an eye on the T-Mobile G1, and on Android in general. I think it’ll grow on (and with) you.”
Ulanoff’s prediction that Google would never build its own Android phone remained true for several more years. Strangely, the first piece of Google-designed Android hardware wasn’t a phone or a tablet but a tedious orb used for streaming music. The Nexus Q was never released, and perhaps that’s a good thing. We noted that its design would “likely spawn a type of streaming media etiquette that will probably only occasionally deteriorate into fistfights.”
In 2016, the search giant launched the Pixel, the first true Google phone, proudly branded as “made by Google.” But by that point, Android had already become the go-to operating system for handset manufacturers. In his review of Android Jelly Bean, then-Lead Analyst Jamie Lendino noted that Android was now the most popular mobile OS on the planet. That created a new problem that would dog Android for years: fragmentation. Handset manufacturers and wireless carriers exerted extreme control over which devices received updates and which did not.
Fragmentation also exacerbated Android’s other weakness: patching critical vulnerabilities. “A lack of updates can leave devices vulnerable to security issues like the Stagefright exploit,” wrote Analyst Ajay Kumar in 2015, “which requires Android 5.1.1, build LMY48I to fix—something that won’t roll out to other Android devices for weeks (or months), if they get it at all.”
And that’s not to mention privacy. As more people became aware of the invasive nature of surveillance capitalism, Android lost some luster compared with Apple. “Since Google’s entire business model revolves around gathering information about each user,” wrote Lead Analyst Michael Muchmore, “it’s hard for the search ad company to compete with Apple, whose profit model doesn’t involve surveillance or profiling.”
Samsung Galaxy Z Fold 2
It’s easy to focus on the platform war between Android and iOS, but the true legacy of Android is that it made smartphones flourish. For a while, there was nothing like the iPhone. But Android allows for myriad choices: cheap phones, small phones, expensive phones, expensive phones that fold in half, and so on.
“I’ve believed for years that Android is the best mobile OS for the most people,” wrote Segan in his 2014 review of Android Lollipop. “Google’s policy of enabling dozens of manufacturers to make smartphones at every price point has transformed the world, putting the Internet in the hands of hundreds of millions of people who wouldn’t otherwise have it.”
The Internet: Communications Go Digital—and Global
When I started researching this story, it was important to me to find something that tied modern PCMag back to its earliest roots. The task proved more difficult than I expected. For one thing, our first issues were mostly about the IBM PC—so the pickings with modern relevance were slim. For another, many of the earliest companies we wrote about simply aren’t around today.
But I did find one through-line connecting all the products in this story. It’s this thing people did when they connected their computers to other computers over a network. This was not the internet—not yet. The first appearance of the term “internet” in PC Magazine came in a definition of Internet Protocol (or IP) in 1983, and it’s the only instance of the word in an issue that spans more than 500 pages. It wasn’t until 1989 that the word started appearing more regularly, still fewer than 10 times over an entire year of issues.
Despite that, our second issue contains a foray into what is recognizable as the modern internet. It was called “computer conferencing.” Over the course of a luxuriously illustrated story, Communications Editor Clifford Barney described how it worked: Several individual computers would contact a host computer. Individual users could create, view, and edit files, with the caveat that only one user could access a given file at a time. Importantly, this system is asynchronous; users come and go at their leisure.
“Computer conferencing is not a substitute for anything, but is an entirely new form of group interaction.”
Beyond document editing, computer conferencing was at one point compared to an actual in-person conference, in which one person at a time presents to a gathered audience. Maybe it’s thanks to pandemic living, attending multiple daily Zoom meetings, and participating in conferences such as RSAC and Black Hat through weird online systems, but this idea felt immediately familiar to me.
Barney wrote that ARPAnet was the largest computer conferencing network available at the time. He also noted that computer conferencing occasionally attracted an unseemly element, perhaps our first discussion of hackers. ARPAnet, Barney wrote, was restricted to DoD users and contractors, and though “outsiders may slip in through a few semi-legal ‘gateways,’ they do not normally have access to the full system.”
“Computer conferencing is such a mysterious animal that there is a great temptation to begin by describing what it isn’t. It isn’t electronic mail, for instance. And it isn’t back-and-forth on-line messaging, like a written-out telephone call. And it certainly isn’t video conferencing.”
Part of why computer conferencing is hard to grasp for a modern reader is that it’s a nebulous collection of functions and ideas. In 1982, the internet was squishy, undefined. PC Magazine recognized that it was important but was uncertain about what it was for.
A more familiar online application is a service called The Source, which Stuart R. Schwartz and Ellen Wilson reviewed in PC Magazine’s third issue. The Source was primarily an “information library,” with articles available for reading, but it was also much more, including rudimentary email, chat, scrolling news, and even a listing of the bills in front of the US Congress.
“I soon recognized the communication possibilities inherent in this new medium,” wrote Schwartz. “A user can receive news, transmit text, reproduce documents, and rapidly communicate information to a specialized interest group that know each other only through the electronic services.”
The Source was Wikipedia, Gmail, and Twitter rolled together, but it wasn’t free. A subscription cost a one-time $100 fee plus hourly rates that rose and fell depending on when you used the service. $4.25 got you an hour of time at midnight, but “prime time” access could cost $18 an hour.
“With each passing month it seems more evident that my green screen will become an expanding window on the world.”
Unsurprisingly, the challenges of human interaction were amplified by taking them online. “I had read about CHAT in the user’s manual for The Source, but it didn’t prepare me for the reality of this stranger coming into my home electronically. I was shocked but managed to respond,” wrote Schwartz and Wilson.
“The general novelty of CHAT soon wears off,” the story continued. “After three weeks I was an old-timer. There are just so many ways you can ask someone, ‘What kind of computer do you have?'”
Unlike the modern web, which is designed to keep us doomscrolling, The Source couldn’t be enjoyed for long stretches. “I doubt very much, however, that such services will supplant the printed page. There’s an optimum amount of time one can spend watching information scroll across the screen. For me it seems to be about 60 minutes at a sitting,” wrote Schwartz and Wilson.
In its early writing about the internet, we see PC Magazine at its best. Curious about something new, we recognized the imaginative possibilities of new technology but balanced them against the harsh realities of cost and practicality. Even as the very definition of the internet was still forming, PCMag saw its potential and even some of the consequences that feel all too real today.
The Source and computer conferencing feel quaint compared with our modern experience. The internet has brought people together and connected them with more information than ever, but it’s also become a haven for disinformation that drives people apart. Even when we walk away from our computers and phones, we quickly reconnect with a smart TV, a streaming box, or a self-driving car. And while we’re often enamored of new internet distractions, that constant connection can also be a burden.
“Though smartphones don’t take up much space, they do create emotional clutter,” wrote Senior Features Writer Chandra Steele in 2022. “Social media fights and memories that fill up the memory on our gadgets can hold us back.”