Project title

Making Deep Neural Networks Ready for Mobile/Edge Devices

Submitted to:

  • Tejalal Choudhary

Submitted by:

  1. Devdatta Khoche
  2. Jinu Lilly Joseph
  3. Jagdeeh K
  4. Parth Deshmukh
  5. NSV kalyan

Project Description

With recent advances in deep learning and AI comes the need to run state-of-the-art models on edge devices without sacrificing their accuracy. These models can instead be served from the cloud through APIs, but that approach carries significant costs in security and latency. At the same time, deep learning models keep growing larger and more computationally expensive as more capabilities are built into them, and they no longer fit within the limited memory of edge devices. Deploying state-of-the-art DNNs on edge devices is therefore constrained by two factors: the computational power needed for inference, and the size of the model itself, which makes it far from portable. We propose a redundant-filter pruning method that removes redundant filters from each convolutional layer, reducing both the model's size and the computation needed for inference.
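To illustrate the idea, here is a minimal NumPy sketch of redundant-filter pruning for a single convolutional layer. The cosine-similarity measure and the `threshold` value are illustrative assumptions for this sketch, not necessarily the exact criterion used in the project:

```python
import numpy as np

def prune_redundant_filters(filters, threshold=0.95):
    """Keep one representative from each group of near-duplicate filters.

    filters: array of shape (num_filters, in_channels, kh, kw),
             i.e. one convolutional layer's weights.
    threshold: cosine-similarity cutoff above which two filters are
               treated as redundant (hypothetical value, for illustration).
    Returns the pruned filter bank and the indices of kept filters.
    """
    # Flatten each filter into a vector and normalize it.
    flat = filters.reshape(filters.shape[0], -1)
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    unit = flat / np.maximum(norms, 1e-12)

    # Pairwise cosine similarity between all filters.
    sim = unit @ unit.T

    keep = []
    for i in range(flat.shape[0]):
        # Drop filter i if it is too similar to an already-kept filter.
        if all(sim[i, j] < threshold for j in keep):
            keep.append(i)
    return filters[keep], keep
```

Pruning a filter this way also removes the corresponding output channel, so in a full network the next layer's input channels would be pruned to match; that bookkeeping is omitted here for brevity.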

Project Poster
