
Browse IS/STAG (S025)

Search for a Thesis


Found 1 record:

  Surname, name:    BULÍN Martin
  Title:            Optimization of neural network
  Type of thesis:   Master's thesis
  Supervisor:       Šmídl Luboš
  Reviewer:         Švec Jan
  Date of defence:  20.06.2017
  Thesis status:    Thesis finished and defended successfully (DUO)

Thesis info Optimization of neural network

  • Basic data
The document you are accessing is protected by copyright law. Unauthorised use may lead to criminal sanctions.
Name BULÍN Martin
Acad. Yr. 2016/2017
Assigning department KKY
Date of defence Jun 20, 2017
Type of thesis Master's thesis
Thesis status Thesis finished and defended successfully (DUO).
Completeness of mandatory entries - All mandatory fields for this Thesis are filled in.
Main topic Optimalizace neuronové sítě
Main topic in English Optimization of neural networks
Title according to student Optimization of neural network
English title as given by the student Optimization of neural network
Parallel name -
Subtitle -
Supervisor Šmídl Luboš, Ing. Ph.D.
Reviewer Švec Jan, Ing. Ph.D.
Annotation Neural networks can be trained to work well for particular tasks, but we hardly ever know why they work so well. Because of their complicated architectures and enormous number of parameters, we usually end up with well-working black boxes, and it is hard, if not impossible, to make targeted changes in a trained model. In this thesis we focus on network optimization: specifically, we make networks small and simple by removing unimportant synapses while keeping the classification accuracy of the original fully connected networks. In our experience, at least 90% of the synapses in fully connected networks are redundant. A pruned network consists of important parts only, so we can find input-output rules and make statements about individual parts of the network. A new measure is introduced to identify which synapses are unimportant. The methods are presented on six examples that show the ability of our pruning algorithm 1) to find a minimal network structure; 2) to select features; 3) to detect patterns among samples; 4) to partially demystify a complicated network; and 5) to rapidly reduce learning and prediction time. The network pruning algorithm is general and applicable to any classification problem.
Keywords network pruning, minimal network structure, network demystification, weight significance, removing synapses, network pathing, feature energy, network optimization, neural network
Length of the covering note 66 pages (85,281 characters)
Language English
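The annotation describes the core operation (scoring each synapse and removing the unimportant ones while preserving accuracy) but not the new significance measure itself. The sketch below therefore uses plain weight magnitude as a stand-in score; the function name prune_by_magnitude is hypothetical, and the 90% fraction is taken from the annotation's redundancy estimate.

    import numpy as np

    def prune_by_magnitude(weights, fraction):
        """Zero out the given fraction of synapses with the smallest |weight|.

        Stand-in for the thesis's own significance measure, which the
        annotation does not specify; any per-synapse score could be
        plugged in here instead.
        """
        w = weights.copy()
        cutoff = np.quantile(np.abs(w), fraction)  # score below which a synapse is dropped
        mask = np.abs(w) >= cutoff                 # True = synapse kept
        return w * mask, mask

    # Toy fully connected layer: 4 inputs -> 3 outputs.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 3))

    pruned_W, mask = prune_by_magnitude(W, fraction=0.9)  # annotation: >= 90% redundant
    print(f"kept {mask.sum()} of {mask.size} synapses")

In practice the pruning would be applied layer by layer and followed by an accuracy check (and possibly retraining), since keeping the original classification accuracy is the constraint the annotation emphasizes.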
Research Plan
  1. Study the training of neural networks for classification and the possibilities for optimizing/pruning the number of network parameters.
  2. Design an algorithm for analysing and pruning the parameters of a neural network.
  3. Prepare suitable test tasks.
  4. Evaluate the proposed algorithm and compare the results with reference models (a minimal sketch of this comparison follows below).
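A rough sketch of step 4 under loud assumptions: scikit-learn's MLPClassifier stands in for the fully connected reference model, the digits dataset stands in for the thesis's test tasks, and naive magnitude pruning of the learned weights stands in for the thesis's significance-based algorithm.

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Reference model: a small fully connected network.
    ref = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    ref.fit(X_tr, y_tr)
    acc_ref = ref.score(X_te, y_te)

    # Pruned variant: zero the 90% smallest-magnitude weights in each layer
    # (placeholder for the thesis's significance measure).
    for W in ref.coefs_:
        W[np.abs(W) < np.quantile(np.abs(W), 0.9)] = 0.0
    acc_pruned = ref.score(X_te, y_te)

    print(f"reference accuracy: {acc_ref:.3f}")
    print(f"pruned accuracy:    {acc_pruned:.3f}")

The annotation's claim is that a well-chosen significance measure keeps the two accuracies close; with this naive magnitude criterion and no retraining, a noticeable drop is possible.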
Recommended resources
[1] Mozer, M. C., Smolensky, P. Skeletonization: A Technique for Trimming the Fat from a Network via Relevance Assessment. CU-CS-421-89, Computer Science Technical Reports, 1989.
[2] Karnin, E. D. A Simple Procedure for Pruning Back-Propagation Trained Neural Networks. IEEE Transactions on Neural Networks, 1990.
[3] LeCun, Y., Denker, J. S., Solla, S. A. Optimal Brain Damage. Advances in Neural Information Processing Systems, 1990.
[4] Psutka, J., Müller, L., Matoušek, J., Radová, V. Mluvíme s počítačem česky. Academia, Praha, 2006.
Relates to practice No
Enclosed appendices -
Appendices bound in thesis graphs, tables
Taken from the library Yes
Full text of the thesis
Thesis defence evaluation Excellent
Appendices
Reviewer's report
Supervisor's report
Defence procedure record -
Defence procedure record file