
Neural Networks: You’ve Got It So Easy

Neural networks are all the rage right now with increasing numbers of hackers, students, researchers, and businesses getting involved. The last resurgence was in the 80s and 90s, when there was little or no World Wide Web and few neural network tools. The current resurgence started around 2006. From a hacker’s perspective, what tools and other resources were available back then, what’s available now, and what should we expect for the future? For myself, a GPU on the Raspberry Pi would be nice.

The 80s and 90s

Neural network 80s/90s books and mags

For the young’uns reading this who wonder how us old geezers managed to do anything before the World Wide Web, hardcopy magazines played a big part in making us aware of new things. And so it was Scientific American magazine’s September 1992 special issue on Mind and Brain that introduced me to neural networks, both the biological and artificial kinds.

Back then you had the option of writing your own neural networks from scratch or ordering source code from someone else, which you’d receive on a floppy diskette in the mail. I even ordered a floppy from The Amateur Scientist column of that Scientific American issue. You could also buy a neural network library that would do all the low-level, complex math for you. There was also a free simulator called Xerion from the University of Toronto.
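For a feel of what "from scratch" involved, here's a minimal sketch in modern Python with NumPy (a luxury not available back then): a tiny two-layer network trained by backpropagation to learn XOR. The layer sizes, learning rate, and iteration count are arbitrary choices for the demo.

```python
import numpy as np

# A tiny 2-8-1 network learning XOR with plain backpropagation.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule applied layer by layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent updates
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should head toward 0, 1, 1, 0
```

That's the whole algorithm, but getting the chain-rule bookkeeping right for every layer type yourself is exactly the tedium the libraries of the day, and today's frameworks, were built to hide.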

Keeping an eye on the bookstore science sections did turn up the occasional book on the subject. The classic was the two-volume Parallel Distributed Processing: Explorations in the Microstructure of Cognition, by Rumelhart, McClelland et al. A favorite of mine was Neural Computation and Self-Organizing Maps: An Introduction, useful if you were interested in neural networks controlling a robot arm.

There were also short courses and conferences you could attend. The conference I attended in 1994 was a free two-day one put on by Geoffrey Hinton, then of the University of Toronto and a leader in the field both then and now. The best-regarded annual conference at the time was the Neural Information Processing Systems (NIPS) conference, still going strong today.

And lastly, I recall combing the libraries for published papers. My stack of conference papers and course handouts, photocopied articles, and handwritten notes from that period is around 3″ thick.

Then things went relatively quiet. While neural networks had found use in a few applications, they hadn’t lived up to their hype and from the perspective of the world, outside of a limited research community, they ceased to matter. Things remained quiet as gradual improvements were made, along with a few breakthroughs, and then finally around 2006 they exploded on the world again.

The Present Arrives

We’re focusing on tools here, but briefly, those breakthroughs were mainly:

  • new techniques for training networks that go more than three or four layers deep, now called deep neural networks
  • the use of GPUs (Graphics Processing Units) to speed up training
  • the availability of training data containing large numbers of samples

Neural Network Frameworks

There are now numerous neural network libraries, usually called frameworks, available for free download under various licenses, many of them open source. Most of the more popular ones let you run your neural networks on GPUs, and are flexible enough to support most types of networks.

Here are most of the more popular ones. They all have GPU support except for FANN.


TensorFlow

Languages: Python, C++ is in the works

TensorFlow is Google’s latest neural network framework. It’s designed for distributing networks across multiple machines and GPUs. It can be considered a low-level framework, offering great flexibility but also a steeper learning curve than high-level ones like Keras and TFLearn, both discussed below. However, they are working on producing a version of Keras integrated into TensorFlow.

We’ve already seen this one in a hack here on Hackaday, in a hammer and beer bottle recognizing robot, and we even have an introduction to using TensorFlow.


Theano

Languages: Python

This is an open source library for doing efficient numerical computations involving multi-dimensional arrays. It’s from the University of Montreal, and runs on Windows, Linux and OS X. Theano has been around for a long time; version 0.1 was released in 2009.


Caffe

Languages: Command line, Python, and MATLAB

Caffe is developed by Berkeley AI Research and community contributors. You define your model in a plain text file, give details on how to train it in a second plain text file called a solver, and then pass both to the caffe command line tool, which trains the neural network. You can then load the trained net from a Python or MATLAB program and use it to do something, image classification for example.
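As a concrete sketch of that workflow (the file names and values here are made up), a minimal solver file might look like this:

```
net: "my_net.prototxt"   # the model definition to train
base_lr: 0.01            # starting learning rate
max_iter: 10000          # number of training iterations
solver_mode: GPU         # train on the GPU rather than the CPU
```

You'd then train with something like `caffe train -solver solver.prototxt`, which periodically snapshots a .caffemodel weights file that you can later load from Python or MATLAB.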


CNTK

Languages: Python, C++, C#

This is the Microsoft Cognitive Toolkit (CNTK) and runs on Windows and Linux. They’re currently working on a version to be used with Keras.


Keras

Languages: Python

Written in Python, Keras uses either TensorFlow or Theano underneath, making those frameworks easier to use. There are also plans to support CNTK. Work is underway to integrate Keras into TensorFlow, which will result in a separate TensorFlow-only version of Keras.
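For a taste of why the high-level approach is popular, here's roughly what defining a small network looks like in Keras (a sketch only: the layer sizes are arbitrary placeholders, and it needs Keras plus a TensorFlow or Theano backend installed to run):

```python
from keras.models import Sequential
from keras.layers import Dense

# A small fully connected network for a binary classification task
model = Sequential([
    Dense(64, activation='relu', input_dim=20),  # hidden layer, 20 input features
    Dense(1, activation='sigmoid'),              # single probability output
])
model.compile(optimizer='sgd', loss='binary_crossentropy')
# model.fit(x_train, y_train, epochs=10)  # x_train/y_train are your own data
```

Those few lines are the whole model definition; the equivalent low-level TensorFlow or Theano code is considerably longer.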

TFLearn

Languages: Python

Like Keras, this is a high-level library built on top of TensorFlow.


FANN

Languages: Supports over 15 languages, no GPU support

This is a high-level open source library written in C. It’s limited to fully connected and sparsely connected neural networks. However, it’s been popular over the years, and has even been included in Linux distributions. It’s recently shown up here on Hackaday in a robot that learned to walk using reinforcement learning, a machine learning technique that often makes use of neural networks.


Torch

Languages: Lua

Open source…

Can You Game on a Mac?

Macs have a lot of advantages. Maybe you like the simplicity of macOS, the sexy industrial design, or work in a creative field where they’re pretty much a requirement. But if you’re also a gamer, you may be wondering: can they handle the games you want to play as well as Windows?

Can You Play Games on a Mac?

Macs are made of the same components as any other PC. They’re just Intel x86 computers in fancier cases, running a different operating system. This means there’s no real hardware barrier to gaming on a Mac. It’s not like a PC has some magic video game component that your Mac lacks.

However, Macs aren’t exactly designed for gaming. The discrete graphics cards used in high-end Macs aren’t all that great, and you don’t have the choice of the more powerful graphics cards you would in some Windows PCs. The Mac Pro is an exception: it carries a decent graphics card, but it’ll cost you a lot more than a comparable Windows PC would.

These graphics cards are also soldered in, so there’s no way to upgrade them a year or two down the line—even on desktops like the iMac or Mac Pro. Windows desktops are more upgradeable in this respect.

Entry level Macs don’t have dedicated graphics cards at all—they have integrated graphics chips that are even more asthmatic. They might reach the absolute minimum requirements of some popular modern games, but just barely.

There’s no way you’ll be able to play new games at full resolution with all the detail settings cranked up, even with a specced-out iMac—but they are technically capable of playing many games. Even a MacBook Air can play Minecraft. But, although it’s possible, is it worth doing?

A Mac is never going to be as good for gaming as a dedicated Windows PC, especially for the price. Even a Mac Pro can’t compete with a gaming-focused rig that costs a quarter of its $2999 price tag. If you’re serious about having the best gaming experience, your Mac isn’t going to cut it. Build your own gaming PC or buy a console and be done with it!

If you’re looking to casually play the occasional game, though, a Mac may suffice. I travel a lot, and only have my MacBook with me when I do. I’m away from my beloved PlayStation 4 for months at a time. My MacBook is able to give me a small gaming fix. It might be more methadone than heroin, but it’s something.

What Games Are Available?

The biggest issue with gaming on a Mac, though, is game availability. Windows’ DirectX APIs are incredibly popular with game developers. They don’t have any equivalents on macOS, which makes it harder for developers to port their games. Because…

How to Maximize Your Linux Laptop’s Battery Life

Laptop manufacturers spend a lot of time tuning their device drivers for Windows battery life. Linux usually doesn’t get the same attention. Linux may perform just as well as Windows on the same hardware, but it won’t necessarily have as much battery life.

Linux’s battery usage has improved dramatically over the years. The Linux kernel has gotten better, and Linux distributions automatically adjust many settings when you’re using a laptop. But you can still do some things to improve your battery life.

Basic Battery-Saving Tips

Before you do anything too complex, adjust the same settings you would on a Windows laptop or MacBook to maximize battery life.

For example, tell your Linux laptop to suspend—this is what Linux calls sleep mode—more quickly when you’re not using it. You’ll find this option in your Linux desktop’s settings. For example, head to System Settings > Power on an Ubuntu desktop.

Screen brightness can affect battery life dramatically. The brighter your display backlight, the worse your battery life will be. If your laptop has hotkeys to change screen brightness, try them—they’ll hopefully work on Linux, too. If not, you’ll find this option somewhere in your Linux desktop’s settings. It’s available at System Settings > Brightness & Lock on Ubuntu.
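If you want to poke at the backlight directly, most laptops expose it through sysfs. A sketch (the device name varies by machine; intel_backlight is just a common example, and the value 400 is arbitrary):

```shell
# Read the maximum and current brightness for this backlight device
cat /sys/class/backlight/intel_backlight/max_brightness
cat /sys/class/backlight/intel_backlight/brightness
# Set a lower value (needs root); smaller numbers mean a dimmer backlight
echo 400 | sudo tee /sys/class/backlight/intel_backlight/brightness
```

Your desktop's brightness slider and hotkeys ultimately drive this same interface, so the graphical settings are usually the easier route.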

You can also tell your Linux desktop to turn off the screen more quickly when it’s inactive. The laptop will use less power when its screen is off. Don’t use a screensaver, as those just waste power by making your computer do more work and leaving the display on.

You can also disable hardware radios you don’t use. For example, if you don’t use Bluetooth, you can disable it to gain some more battery life. Head to System Settings > Bluetooth to disable Bluetooth on an Ubuntu desktop.

If you’re not using Wi-Fi, you can save a bit of power by disabling that, too. On Ubuntu, head to System Settings > Network and enable “Airplane Mode” to disable Wi-Fi and other wireless radios.
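If you prefer the command line, or your desktop doesn't expose these switches, the rfkill tool can toggle the same radios (blocking may require root on some systems):

```shell
# List wireless radios and their soft/hard block state
rfkill list
# Soft-block the Bluetooth and Wi-Fi radios to save power
rfkill block bluetooth
rfkill block wifi
# Re-enable a radio later
rfkill unblock wifi
```

A soft block here is the same thing airplane mode does; a hard block means a physical switch on the laptop, which software can't override.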

Remember that what you do with the laptop is also important. Running heavier software and using more CPU resources will cause your laptop to use more battery power. For this reason, you may want to look at a more lightweight desktop environment, such as the LXDE-based Lubuntu instead of the Unity-based main Ubuntu desktop.

Install Proprietary Graphics Drivers (If You Need Them)

If your laptop has integrated Intel graphics, congratulations. You shouldn’t need to worry about power management issues with your graphics drivers. Intel graphics aren’t the fastest, but they have excellent open-source driver support and “just work” out of the box.

If your laptop has NVIDIA or AMD graphics, however, you may need to do some work to decrease power consumption.

The worst case scenario is a laptop with NVIDIA Optimus or AMD’s switchable graphics. Such laptops have two different GPUs. For example, an NVIDIA Optimus laptop will have both a more powerful, battery-draining NVIDIA GPU and a less powerful, battery-friendly Intel GPU. On…