ASICs (application-specific integrated circuits) are designed to perform a single function extremely efficiently and have been pushing Bitcoin mining forward at an astonishing pace. This Quartz article profiles how Bitmain (a large Chinese Bitcoin mining company) is now leveraging its ASIC design expertise to build a chip with some of the most common deep learning algorithms etched directly into silicon, greatly boosting deep learning efficiency.
Wang Jun, who heads the AI program under Zhan, has spent two years working on Bitmain’s deep learning chip. The idea is to etch some of the most common deep learning algorithms into silicon, greatly boosting efficiency. Users will be able to apply their own datasets and build their own models on these ASICs, allowing the resulting neural networks to generate results and learn from those results far more quickly. This is the same approach Google’s London-based DeepMind unit used to train its AlphaGo artificial intelligence, running on Google’s own Tensor Processing Unit chips. Bitmain plans to sell these chips to any corporation that wants to train its own neural nets.
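Bitmain's actual toolchain isn't described in the article, but the workflow it hints at (bring your own dataset and model, let fixed-function silicon accelerate the common operations) is the same one Google exposes for its TPUs through compiler-backed frameworks. The sketch below uses JAX purely as an analogue: the dense matrix multiplies and gradient computations inside `train_step` are exactly the kind of common deep learning operations such a chip would bake into hardware, and `jax.jit` compiles them for whatever accelerator backend happens to be available (CPU, GPU, or TPU). None of this is Bitmain's API; it's just an illustration of the pattern.

```python
import jax
import jax.numpy as jnp

def predict(params, x):
    # Tiny two-layer network: the matmuls here are the fixed, common
    # operations an ASIC accelerator would implement in silicon.
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return h @ w2 + b2

def loss(params, x, y):
    return jnp.mean((predict(params, x) - y) ** 2)

@jax.jit  # compiled once for the available accelerator, then reused every step
def train_step(params, x, y, lr=0.01):
    grads = jax.grad(loss)(params, x, y)
    return [p - lr * g for p, g in zip(params, grads)]

# Toy "bring your own dataset" example: regress each row onto its sum.
key = jax.random.PRNGKey(0)
k1, k2, kx = jax.random.split(key, 3)
params = [
    jax.random.normal(k1, (8, 16)), jnp.zeros(16),
    jax.random.normal(k2, (16, 1)), jnp.zeros(1),
]
x = jax.random.normal(kx, (32, 8))
y = jnp.sum(x, axis=1, keepdims=True)

for _ in range(100):
    params = train_step(params, x, y)
print("final loss:", float(loss(params, x, y)))
```

The point of the sketch is the division of labour: the user supplies the data and model definition in software, while the expensive, repetitive linear algebra is handed off to specialised hardware, which is where a deep-learning ASIC would earn its keep.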