
VR’s high fidelity obsession is holding it back

The virtual reality industry needs to temper its obsession with all things high fidelity.

Last November, I pointed to underwhelming content as the biggest threat to the VR industry. You know, the kind of experiences that woo you with novelty but fail to hook you in for another ride — which is the story for the majority of VR content. And earlier this month, I covered WebVR gaming and made the case for it as the sweet spot that developers and publishers should be prioritizing. But for that to happen, we need to address the elephant in the room: the obsession with high fidelity and the accompanying misconceptions about what users need or want from VR, and about what makes VR content effective.

This insistence on high fidelity has been woven, by design, into the very fabric of how creators “should” frame the value proposition of immersive content. But it conveniently ignores great low-fidelity VR experiences that dispel the myth, like Mozilla’s A-Blast.

“High fidelity is simply the wrong focus,” says WebVR developer and A-Frame contributor Fabien Benetou. “There is marketing hype from hardware vendors to showcase the latest features their graphics cards can support. Those demos are indeed always visually amazing, but they are just tools for sales. It sounds very cool to say you have ‘the best’ when in truth we all only need ‘good enough.’”

The current push for VR, at least this time around, comes courtesy of some very big tech players like HTC and giant game engines like Unity that have a vested interest in shaping an industry as it emerges. The idea is to hinge the emerging tech on their choice of hardware and platforms, and so secure their relevance for the next wave to come. One of the key ways to do that is to play to their strength, which is premium performance, and therefore to position premium at the center of the VR value proposition.

The industry, however, responds on its own terms and takes shape according to actual market realities. That’s why we need to rethink our assumptions about what good or effective content is or can be, unless we want to waste a lot of time and resources pushing the wrong recipes.

In other words, it’s best to use an iterative approach that discovers what makes VR content stick.

Above: A-Blast debuted to dozens of Mozillians in December 2016 at the Mozilla All-Hands event in Hawaii.

But instead, we’ve seen the market jump into the deep end with high fidelity as the operative frame that consumers are expected to adopt. The positioning is all wrong because the landscape is still fuzzy, which leaves a lot of game and content studios out in the wilderness: awkwardly caught between their financials and their forecasts, with a scary gap between the two.

“For a nascent medium like VR whose interaction and cinematic vocabulary is evolving, creatively we need rapid experimentation and feedback: people need to be able to prototype their ideas quickly, share their experiments with a large population…

How to Use Multiple External Monitors With Your Laptop

Multiple monitors are awesome. They really are—ask anyone who’s used a two- or three-screen setup for their desktop, and they’ll tell you that they have a hard time going back to just one. Laptops have a built-in advantage here, since they have one screen: to boost productivity, just add a monitor.

But what if you want more than one screen hooked up to your notebook at once? What if your laptop lacks a bunch of external video ports? What if you’re travelling, and you can’t lug around a full-sized monitor? Don’t worry, you still have more options than you might think.

The Ideal Solution for Newer Laptops: Thunderbolt

Thunderbolt 3, which uses the new USB Type-C connector standard, is the newest way for laptops and tablets to output video. The advantages are obvious: a single cable can handle video, audio, standard data transmission (for external hard drives or a wired Internet connection) and power, all at the same time. Not only does this reduce clutter on your desk—assuming you have the hardware to take advantage of it, of course—it means laptops can be made smaller and thinner by consolidating ports.

So, if you have a laptop with Thunderbolt 3 and a Thunderbolt-capable monitor, this is by far the best solution. You can just hook up each monitor to one Thunderbolt/USB-C port.

However, it’s rarely that simple. Unless you have a very new laptop and very new monitors, you’ll probably need a bit more to make this work:

  • If you have a laptop with multiple Thunderbolt/USB-C ports but older monitors that don’t have Thunderbolt input, you’ll need some sort of adapter, like a USB-C to HDMI or a USB-C to DVI adapter. Remember, you’ll need one adapter for each monitor you’re connecting.
  • If your laptop only has one Thunderbolt/USB-C port, you’ll likely need some sort of docking station to connect two monitors to that one port. We recommend checking out the Dell Thunderbolt Dock, though there are others out there as well. Note that some laptops, like the small one-port MacBook, do not support running multiple displays from one port using these docks, so check your laptop’s specifications, and if you’re going to try a dock, buy from a store with a good return policy in case it doesn’t work.

Thunderbolt has a massive amount of video bandwidth, and it’s more than capable of supporting multiple standard monitors (the new MacBook Pros can output to two 5K displays at once, so long as you have the right adapters). Specialized adapters—basically mini laptop docks—are designed for exactly this kind of regular docking to a multi-monitor setup, complete with mouse, keyboard, and other connections.
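If you want to sanity-check whether a particular multi-monitor setup fits within a single Thunderbolt 3 link, a rough back-of-the-envelope calculation is enough. The sketch below is a simplification: it multiplies resolution, refresh rate, and bits per pixel for each display and compares the total against Thunderbolt 3’s nominal 40 Gbps, ignoring blanking intervals, protocol overhead, and compression, so treat the result as a ballpark figure rather than a guarantee.

```python
# Rough estimate of whether a set of displays fits a Thunderbolt 3 link.
# Simplification: ignores blanking intervals, protocol overhead, and
# compression, so real-world headroom will be somewhat smaller.

THUNDERBOLT3_GBPS = 40  # nominal link rate

def display_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel bandwidth for one display, in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Example: two 4K monitors at 60 Hz with 8 bits per color channel (24 bpp)
monitors = [(3840, 2160, 60), (3840, 2160, 60)]
total = sum(display_gbps(w, h, hz) for w, h, hz in monitors)

print(f"Total: {total:.1f} Gbps of {THUNDERBOLT3_GBPS} Gbps nominal")
# Prints roughly 23.9 Gbps, comfortably within a single Thunderbolt 3 link
```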

Once USB-C and Thunderbolt become more common on laptops and…

What All of Your Computer’s Specs Really Mean

Computer specs can be a baffling mix of acronyms and numbers at the best of times, but it’s worth learning something about them: It’ll help you choose a new computer, troubleshoot your old computer, and generally understand more about the relationship between the specs on the page and the experience you’re getting.

Such is the complexity of the modern-day computer that we could’ve written an article twice this size on any one of the categories listed below (look at any graphics card forum for proof)—but the main aim here is to help you understand the specs you see listed with desktops and laptops, and give you an idea of the difference they make to performance.

CPU

The Central Processing Unit, or CPU, or processor, is the brains of the operation: it handles all those calculations that keep your computer actually working. The CPU inside your machine is the main (but not the only) contributor to its overall speed and performance.

CPUs have a certain number of cores, mini computing units that are effectively CPUs in their own right—they let your computer work on multiple tasks at the same time, so the more cores the better. On top of this, each core has a clock speed, a measurement of how fast it can do its number crunching, usually measured in gigahertz (GHz).

Comparing the performance of CPUs based on core number and clock speeds is notoriously difficult (sorry shoppers). That’s because multiple factors are involved, most related to the microarchitecture of the CPUs. The microarchitecture is basically the way that the cores and the other bits of a CPU are packed together.

The two big computer CPU makers, Intel and AMD, have their own microarchitecture designs. When you see references to Intel Skylake, Intel Kaby Lake, or AMD Zen (on Ryzen chips), this is what’s being referred to, and newer is generally better, as successive microarchitectures allow the CPU to work faster and more efficiently (and use less power).

Intel and AMD also apply their own labels—i3, i5, and i7 in Intel’s case—to indicate relative performance within a microarchitecture family. It’s a useful shorthand reference to the power you can expect, with i7 CPUs the best of the bunch from Intel. In AMD’s case, you’re talking about Ryzen 3, Ryzen 5 and the top-end Ryzen 7.

If you want the very best processors around, you should also look out for what Intel calls hyper-threading and what AMD calls simultaneous multi-threading. These technologies effectively double the number of cores (virtually, not physically), giving you significantly improved performance in demanding, well-threaded applications like video editing or CAD software.
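A quick way to see this on your own machine is to compare the physical core count with the logical (virtual) core count the operating system reports. The short sketch below uses Python’s standard library plus the third-party psutil package (an assumption: it has to be installed separately); on a chip with hyper-threading or SMT enabled, the logical count is typically double the physical one.

```python
# Compare physical cores with logical (virtual) cores.
# Assumes the third-party psutil package is installed: pip install psutil
import os
import psutil

logical = os.cpu_count()                    # logical cores the OS schedules on
physical = psutil.cpu_count(logical=False)  # actual physical cores

print(f"Physical cores: {physical}")
print(f"Logical cores:  {logical}")
if physical and logical and logical > physical:
    print("Hyper-threading / SMT appears to be enabled.")
```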

Unless you’re building your own PC from scratch, that’s probably all you need to know when looking at system listings, but CPUs have numerous other specs, including the amount of high-speed memory cache and the extra graphics processing capabilities that are on board. If your CPU has enough integrated graphics oomph, you don’t need a separate card or chipset, of which more below.

Graphics

The other big factor in computer performance, particularly if you’re gaming or working with a lot of video and images, is graphics.

We only gave it a brief mention in the processor section, but many Intel CPUs now come with a decent amount of graphics processing power built in, enough for most users to get by with a bit of web browsing, Twittering, essay writing and even light image editing and gaming. You can also get integrated graphics chipsets built into the motherboard…

HDMI vs DisplayPort vs DVI: Which Port Do You Want On Your New Computer?

It doesn’t seem so long ago that we had only one reliable way to connect a computer to an external monitor. Now the good old VGA port, may it rest in peace, is only found on designated “business” machines and adapters. In its place, we have a variety of alternatives, all of which seem to be fighting each other for the limited space on your laptop or graphics card. Let’s break down the options for your next PC purchase.

HDMI

HDMI is the most widely used of the three options here, if only because it’s the de facto standard for anything connecting to televisions. Because of its wide adoption, HDMI is also included on most recent monitors and many laptops, except for the smallest ultraportable models. The acronym stands for “High-Definition Multimedia Interface.”

The standard has been around since the early 2000s, but determining its capabilities is a bit tricky, because it’s gone through so many revisions. The latest release is HDMI 2.1, which supports a staggering 10K resolution (more than 10,000 pixels wide) at 120 hertz. But version 2.1 is just starting to appear in consumer electronics; the latest laptops that feature HDMI ports will probably top out at version 2.0b, which supports 4K video at 60 frames per second with high dynamic range (HDR).
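Those version numbers map directly onto raw bandwidth. As a rough illustration (a simplified sketch that ignores blanking intervals and link-encoding overhead, which eat a noticeable share of the nominal rate in practice), you can estimate the bit rate a given mode demands and compare it against the nominal link rates of HDMI 2.0 (18 Gbps) and HDMI 2.1 (48 Gbps):

```python
# Rough bit-rate estimate for a video mode versus nominal HDMI link rates.
# Simplification: ignores blanking intervals and link-encoding overhead,
# which consume a real share of the nominal bandwidth in practice.

HDMI_NOMINAL_GBPS = {"HDMI 2.0": 18, "HDMI 2.1": 48}

def mode_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed bit rate for a video mode, in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 60 Hz with 10-bit color (HDR) works out to 30 bits per pixel
needed = mode_gbps(3840, 2160, 60, 30)  # roughly 14.9 Gbps

for version, rate in HDMI_NOMINAL_GBPS.items():
    verdict = "fits" if needed < rate else "needs compression or chroma subsampling"
    print(f"{version}: {needed:.1f} of {rate} Gbps -> {verdict}")
```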

HDMI’s biggest advantage over the older DVI standard is that it also carries an audio signal, allowing users to plug into a TV (or a monitor with built-in speakers) with a single cable. This is great for TVs, but most monitors still lack integrated speakers, so you’ll often have to use a conventional headphone jack or simply rely on your laptop’s built-in speakers.

HDMI comes in three primary connection sizes: standard, “Mini,” and “Micro,” getting progressively smaller. The Mini and Micro connections are popular with smaller portable electronics, but if your laptop has an HDMI port, it probably uses the full-sized version. This, combined with a wide variety of compatible monitors and televisions, makes HDMI the most convenient external display option for most users.

DisplayPort

DisplayPort is a bit newer than HDMI, and unlike HDMI it’s an open standard developed by VESA rather than a proprietary one. The full-sized plugs look similar, but DisplayPort uses an asymmetrical notched design versus HDMI’s equal trapezoid.

As competing standards, they share a lot of features in their various incarnations. DisplayPort can also carry audio signals on a single cable, and the latest release supports up to 8K resolution at 60 hertz with high dynamic range. The next version…

The Best All-In-One Windows PCs: Seriously, They’re Actually Good Now

All-in-one PCs have long been the domain of the novice, the hotel business nook, or the interior decorator who can’t stomach seeing a “real” PC in a pristine living room. With the exception of the iMac, they were seen as boring, underpowered boxes with laptop components stuffed behind a cheap screen. But that’s changing.

It’s true that all-in-one machines are mostly cheap and simple, but the form factor has been undergoing a quiet revolution for the last couple of years. While Apple has been content to trim dimensions and call it a day, manufacturers like Microsoft, Lenovo, HP, and others are filling the space with new and exciting designs. You should really check some of these models out before making your next desktop purchase.

Microsoft Surface Studio

The first desktop machine from Microsoft’s self-branded hardware initiative, the Surface Studio is surely the poster child for this new generation of all-in-one machines. Combining a 28-inch touch screen, a fold-down artist’s easel hinge, and the much-praised Surface Pen from the tablet line, the Studio makes a compelling argument for Windows as an artist’s platform. Prices start high and go higher, but with a GTX 965M graphics card and an optional 980M upgrade, the all-in-one can also double as a competent gaming machine (albeit a not-very-upgradeable one).

The $3000 starting price (with a rather paltry 8GB of RAM, no less) and slightly older Intel processors are two bummers in an otherwise amazing hardware package. Ditto for the unique Surface Dial: this rotating wireless tool can be placed directly on the screen for digital manipulation, but it’s a separate $100 purchase and currently limited to only a few applications. Even so, for those who want the absolute cutting edge in desktop design, the Surface Studio might be worth its steep asking price.

HP Envy

The Envy series has long been HP’s showcase for its more bombastic designs, and the latest all-in-one machines to wear the badge are no exception. These desktops combine huge, small-bezel displays with a horizontal component body that integrates a quad-speaker Bang & Olufsen soundbar. At a glance, the design looks like a high-end home theater setup that’s been shrunk down to desktop size, and that’s basically what it is, with a mid-range Windows machine crammed into the package.

The latest Envy designs are also surprisingly affordable, considering their displays. The base configuration for the massive 34-inch model starts at around $1800, though those who want more RAM, a bigger SSD+HDD combo, and a more capable graphics card can spend a bit more. The Envy design also comes in 24-inch and 27-inch versions, some of which offer touch screens, which isn’t an option on the largest version.

Digital Storm Aura, CyberPower PC Arcus, and Origin Omni

Even among these next-gen designs, gamers looking for truly high-end graphics can find their options a bit limited, thanks to the tight packages and non-upgradable components. Boutique PC makers are getting around that by cramming a full desktop into a 34-inch ultrawide monitor, in three separate products that seem to come from the same OEM supplier: the Digital Storm Aura, the CyberPower PC Arcus, and the Origin Omni. Slip off the back cover and you’ll be able to swap out every component, including a massive full-size PCIe desktop graphics card, RAM DIMM slots, SSD and HDD storage bays, and yes, even the desktop-class Intel processor and Mini-ITX motherboard.

Can You Game on a Mac?

Macs have a lot of advantages. Maybe you like the simplicity of macOS, the sexy industrial design, or work in a creative field where they’re pretty much a requirement. But if you’re also a gamer, you may be wondering: can they handle the games you want to play as well as Windows?

Can You Play Games on a Mac?

Macs are made of the same components as any other PC. They’re just an Intel x86 computer in a fancier case with a different operating system. This means there’s no real hardware barrier to gaming on a Mac. It’s not like a PC has some magic video game component that your Mac lacks.

However, Macs aren’t exactly designed for gaming. The discrete graphics cards used in the high-end Macs aren’t all that great, and you don’t have the choice of more powerful graphics cards that you would in some Windows PCs. The Mac Pro is an exception: it carries a decent graphics card, but it’ll cost you a lot more than a comparable Windows PC would.

These graphics cards are also soldered in, so there’s no way to upgrade them a year or two down the line—even on desktops like the iMac or Mac Pro. Windows desktops are more upgradeable in this respect.

Entry-level Macs don’t have dedicated graphics cards at all—they have integrated graphics chips that are even more asthmatic. They might reach the absolute minimum requirements of some popular modern games, but just barely.

There’s no way you’ll be able to play new games at full resolution with all the detail settings cranked up, even with a specced-out iMac—but they are technically capable of playing many games. Even a MacBook Air can play Minecraft. But, although it’s possible, is it worth doing?

A Mac is never going to be as good for gaming as a dedicated Windows PC, especially for the price. Even a Mac Pro can’t compete with a gaming-focused rig that costs a quarter of the Mac Pro’s $2999 price tag. If you’re serious about having the best gaming experience, your Mac isn’t going to cut it. Build your own gaming PC or buy a console and be done with it!

If you’re looking to casually play the occasional game, though, a Mac may suffice. I travel a lot, and only have my MacBook with me when I do. I’m away from my beloved PlayStation 4 for months at a time. My MacBook is able to give me a small gaming fix. It might be more methadone than heroin, but it’s something.

What Games Are Available?

The biggest issue with gaming on a Mac, though, is game availability. Windows’ DirectX APIs are incredibly popular with game developers, and they aren’t available on macOS, which makes it harder for developers to port their games. Because…

What Is Coil Whine, and Can I Get Rid of It on My PC?

What Is Coil Whine?

On a purely technical level, coil whine refers to an undesirable noise emitted by an electronic component vibrating as current runs through it. Just about anything with a power source can create coil whine to some degree, but it’s usually caused by an electrical current going through a power-regulating component like a transformer or inductor, causing its electrical wiring to vibrate at a variable frequency. This happens in almost all electrical devices, usually at a frequency and volume that’s inaudible to humans, especially when the component is tucked inside a metal or plastic PC case.

An old-fashioned electromagnetic inductor in a radio. As the electrical current causes the coil to vibrate against the ring, an audible pitch may be heard.

But when you’re dealing with high-powered components in modern gaming PCs, especially the graphics card and power supply, these vibrations can be audible. This is especially true for anyone who’s sensitive to high-frequency noises. In bad cases, you can actually hear the pitch of the coil whine change as the GPU draws more or less power, and the electrical frequency across various components shifts. It might be particularly noticeable when running a 3D game or high-intensity graphics application. Coil whine can be especially noticeable—not to mention frustrating!—on otherwise “silent” PCs, like low-power home theater PCs or gaming PCs with a liquid cooling system.

Coil whine is really nothing to be concerned about. It can be annoying, of course, but it isn’t like a rattling engine or a squeaking wheel—the noise is a byproduct of your PC and graphics card’s normal operation. Your system isn’t losing any performance or longevity because of coil whine.

(Note: if you hear a distinct hissing or high-pitched whistling instead of a buzz or scratch, that might be the altogether different phenomenon known as “capacitor squeal.” This is something to be concerned about, since it indicates a failing component.)

What Can I Do About It?

Sadly, there isn’t an easy fix for coil whine, like an updated driver…