Exploiting data parallelism in artificial neural networks with Haskell
Functional parallel programming techniques for feed-forward artificial neural networks trained using backpropagation learning are analyzed. In particular, the Data Parallel Haskell extension to the Glasgow Haskell Compiler is considered as a tool for achieving data parallelism. We find much potential and elegance in this method, and determine that a sufficiently large workload is critical in achieving real gains. Several additional features are recommended to increase usability and improve results on small datasets.
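To illustrate the data-parallel pattern the abstract refers to, the sketch below shows a feed-forward layer where each neuron's activation is an independent dot product, so the outer map is the natural data-parallel dimension. This is an illustrative sketch only: Data Parallel Haskell would use parallel arrays (`[:Double:]`) with `mapP` and `sumP` in place of the ordinary lists, `map`, and `sum` used here, and the type names (`Vector`, `Matrix`) and the sigmoid activation are assumptions, not taken from the thesis itself.

```haskell
-- A feed-forward layer as a data-parallel map (sketch).
-- In Data Parallel Haskell the lists below would be parallel
-- arrays and mapP/sumP would distribute the work across cores;
-- ordinary lists are used here to show the shape of the computation.

type Vector = [Double]
type Matrix = [Vector]  -- one row of weights per output neuron

sigmoid :: Double -> Double
sigmoid x = 1 / (1 + exp (negate x))

dot :: Vector -> Vector -> Double
dot xs ys = sum (zipWith (*) xs ys)

-- Each output neuron is computed independently, so the outer
-- map over weight rows is where data parallelism applies.
layer :: Matrix -> Vector -> Vector
layer ws xs = map (sigmoid . dot xs) ws

main :: IO ()
main = print (layer [[0.5, -0.5], [1.0, 1.0]] [1.0, 2.0])
```

Because every element of the output vector depends only on the shared input and its own weight row, the work divides cleanly, which is why a sufficiently large layer (or batch) is needed before the parallel overhead pays off.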