Artificial Intelligence and Machine Learning usually work best with a lot of horsepower behind them to crunch data, evaluate possibilities, and quickly arrive at better answers.
That’s why most AI systems rely on local sensors to gather input, while more powerful hardware in the cloud handles the heavy lifting of computation. It’s how Apple’s Siri and Amazon’s Alexa work, and how IBM Watson can tackle virtually any major task. It is, though, a limiting approach when it comes to building a smarter Internet of Things, and to applying intelligence where there is no Internet connectivity.
“The dominant paradigm is that these [sensor] devices are dumb,” said Manik Varma, a senior researcher with Microsoft Research India.
Now, Varma’s team in India and Microsoft researchers in Redmond, Washington (the overall project is led by researcher Ofer Dekel), have figured out how to compress neural networks, the building blocks of much of modern Machine Learning, from 32-bit weights down to, in some cases, a single bit, and run them on a $10 Raspberry Pi, a low-powered, credit-card-sized computer with a handful of ports and no screen. It’s essentially a bare-bones single-board computer that can be deployed almost anywhere. The company announced the research in a blog post on Thursday.
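To get a feel for what compressing weights from 32 bits down to one bit means, here is a minimal sketch of weight binarization in plain Python. This is an illustrative, generic technique (each weight keeps only its sign, scaled by the mean absolute value, as in published binarization methods); the article does not describe Microsoft’s exact algorithm, and the function name `binarize` is just for this example.

```python
def binarize(weights):
    """Replace each 32-bit float weight with +alpha or -alpha,
    keeping only one bit of information (the sign) per weight.
    alpha is the mean absolute value -- a common scaling choice
    that preserves the overall magnitude of the weight vector."""
    alpha = sum(abs(w) for w in weights) / len(weights)
    return [alpha if w >= 0 else -alpha for w in weights]

weights = [0.25, -0.5, 0.75, -1.0]
print(binarize(weights))  # -> [0.625, -0.625, 0.625, -0.625]
```

Storing just one sign bit per weight (plus a single shared scale) is what makes it plausible to fit a network into the tiny memory of a device like a Raspberry Pi.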
Microsoft’s work is part of a growing trend of moving Machine Learning closer to devices and end users.
Earlier this month at its annual