Posts

Showing posts from January, 2018

Purdue-affiliated Startup FWDNXT Designing Low-power Hardware for Deep Learning

[Image caption: The "Snowflake" mobile coprocessor for Deep Learning from Purdue-affiliated startup FWDNXT (via HPCWire).]

A Purdue-affiliated startup called FWDNXT is designing low-power mobile hardware for deep learning. Eugenio Culurciello, an associate professor at Purdue, says that the "Snowflake" mobile coprocessor "is able to achieve a computational efficiency of more than 91 percent on entire convolutional neural networks..." I'm not exactly sure what he means by 91% computational efficiency. One of the biggest problems with deep learning right now is the lack of agreed-upon benchmarks and ways of measuring performance. The figure could mean 91% of the chip's theoretical peak in single-precision FLOPs, or 91% of the theoretical peak performance achievable on this particular network architecture. Unfortunately, there just isn't any way to know. The original article is posted at HPCWire.
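To make the ambiguity concrete, here is a minimal sketch with entirely hypothetical numbers showing how the same measurement can produce different "efficiency" figures depending on which peak you divide by. None of these values reflect actual Snowflake specifications.

```python
# Hypothetical numbers only -- illustrating two possible meanings of "91% efficiency".

peak_chip_gflops = 100.0         # hypothetical peak single-precision throughput of the chip
achievable_gflops_on_net = 80.0  # hypothetical ceiling for this particular network
                                 # (layer shapes, memory traffic, etc. keep it below chip peak)
measured_gflops = 73.0           # hypothetical measured throughput on the full network

# Interpretation 1: efficiency relative to the chip's theoretical peak FLOPs.
eff_vs_chip_peak = measured_gflops / peak_chip_gflops

# Interpretation 2: efficiency relative to the peak attainable on this network.
eff_vs_network_peak = measured_gflops / achievable_gflops_on_net

print(f"vs. chip peak:    {eff_vs_chip_peak:.0%}")     # 73%
print(f"vs. network peak: {eff_vs_network_peak:.0%}")  # 91%
```

The same measured throughput reads as 73% or 91% depending on the denominator, which is exactly why a stated efficiency number is hard to interpret without an agreed-upon benchmark definition.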

Non-Disclosure Agreements

At some point, many developers and technologists will have the opportunity to be briefed on information that is not publicly available. Companies use non-disclosure agreements (NDAs) as a legal mechanism to make sure that trade secrets are actually kept secret.

Squashing commits in Git

If you're like me, then you tend to follow the ultra-conservative practice of "commit early, commit often" when using git. If so, I applaud you. It's important to be sure you are keeping track of your work, and if you push your commits back to GitHub, Bitbucket, or GitLab, that's even better. No one wants to do a day's worth of software development work, only to wake up the next morning to a local hard drive failure.
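As a quick sketch of that workflow, the commands below show frequent local commits pushed to a remote, followed by one common way to squash a run of small commits with an interactive rebase before sharing them. The branch name and commit count here are placeholders, not anything specific to this post.

```bash
# Commit early and often, and push so the work is backed up on the remote.
git add -A
git commit -m "WIP: incremental progress"
git push origin feature-branch        # placeholder branch name

# Later, squash the last three commits into one tidy commit.
# In the editor that opens, keep the first line as "pick" and change the
# remaining lines to "squash" (or "s"), then write a combined message.
git rebase -i HEAD~3

# If the squashed history was already pushed, update the remote branch
# with a force push (the safer variant shown here).
git push --force-with-lease origin feature-branch
```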