
Title
Deep learning through sparse and low-rank modeling /

Creator
edited by Zhangyang Wang, Yun Fu, Thomas S. Huang.

Subject
Machine learning; COMPUTERS -- General.

Classification
Q325.5

Library
Center and Library of Islamic Studies in European Languages

Location
Province: Qom; City: Qom


Library contact: 025-32910706

INTERNATIONAL STANDARD BOOK NUMBER

Number (ISBN)
012813660X
Number (ISBN)
9780128136607
Erroneous ISBN
0128136596
Erroneous ISBN
9780128136591

TITLE AND STATEMENT OF RESPONSIBILITY

Title Proper
Deep learning through sparse and low-rank modeling /
General Material Designation
[Book]
First Statement of Responsibility
edited by Zhangyang Wang, Yun Fu, Thomas S. Huang.

PUBLICATION, DISTRIBUTION, ETC.

Place of Publication, Distribution, etc.
[Place of publication not identified] :
Name of Publisher, Distributor, etc.
Academic Press, an imprint of Elsevier,
Date of Publication, Distribution, etc.
[2019]
Date of Publication, Distribution, etc.
©2019

PHYSICAL DESCRIPTION

Specific Material Designation and Extent of Item
1 online resource

SERIES

Series Title
Computer vision and pattern recognition series

INTERNAL BIBLIOGRAPHIES/INDEXES NOTE

Text of Note
Includes bibliographical references and index.

CONTENTS NOTE

Text of Note
Front Cover; Deep Learning Through Sparse and Low-Rank Modeling; Copyright; Contents; Contributors; About the Editors; Preface; Acknowledgments; 1 Introduction; 1.1 Basics of Deep Learning; 1.2 Basics of Sparsity and Low-Rankness; 1.3 Connecting Deep Learning to Sparsity and Low-Rankness; 1.4 Organization; References; 2 Bi-Level Sparse Coding: A Hyperspectral Image Classification Example; 2.1 Introduction; 2.2 Formulation and Algorithm; 2.2.1 Notations; 2.2.2 Joint Feature Extraction and Classification; 2.2.2.1 Sparse Coding for Feature Extraction
Text of Note
2.2.2.2 Task-Driven Functions for Classification; 2.2.2.3 Spatial Laplacian Regularization; 2.2.3 Bi-level Optimization Formulation; 2.2.4 Algorithm; 2.2.4.1 Stochastic Gradient Descent; 2.2.4.2 Sparse Reconstruction; 2.3 Experiments; 2.3.1 Classification Performance on AVIRIS Indiana Pines Data; 2.3.2 Classification Performance on AVIRIS Salinas Data; 2.3.3 Classification Performance on University of Pavia Data; 2.4 Conclusion; 2.5 Appendix; References; 3 Deep l0 Encoders: A Model Unfolding Example; 3.1 Introduction; 3.2 Related Work; 3.2.1 l0- and l1-Based Sparse Approximations
Text of Note
3.2.2 Network Implementation of l1-Approximation; 3.3 Deep l0 Encoders; 3.3.1 Deep l0-Regularized Encoder; 3.3.2 Deep M-Sparse l0 Encoder; 3.3.3 Theoretical Properties; 3.4 Task-Driven Optimization; 3.5 Experiment; 3.5.1 Implementation; 3.5.2 Simulation on l0 Sparse Approximation; 3.5.3 Applications on Classification; 3.5.4 Applications on Clustering; 3.6 Conclusions and Discussions on Theoretical Properties; References; 4 Single Image Super-Resolution: From Sparse Coding to Deep Learning; 4.1 Robust Single Image Super-Resolution via Deep Networks with Sparse Prior; 4.1.1 Introduction
Text of Note
4.1.2 Related Work; 4.1.3 Sparse Coding Based Network for Image SR; 4.1.3.1 Image SR Using Sparse Coding; 4.1.3.2 Network Implementation of Sparse Coding; 4.1.3.3 Network Architecture of SCN; 4.1.3.4 Advantages over Previous Models; 4.1.4 Network Cascade for Scalable SR; 4.1.4.1 Network Cascade for SR of a Fixed Scaling Factor; 4.1.4.2 Network Cascade for Scalable SR; 4.1.4.3 Training Cascade of Networks; 4.1.5 Robust SR for Real Scenarios; 4.1.5.1 Data-Driven SR by Fine-Tuning; 4.1.5.2 Iterative SR with Regularization; Blurry Image Upscaling; Noisy Image Upscaling; 4.1.6 Implementation Details
Text of Note
4.1.7 Experiments; 4.1.7.1 Algorithm Analysis; 4.1.7.2 Comparison with State-of-the-Art; 4.1.7.3 Robustness to Real SR Scenarios; Data-Driven SR by Fine-Tuning; Regularized Iterative SR; 4.1.8 Subjective Evaluation; 4.1.9 Conclusion and Future Work; 4.2 Learning a Mixture of Deep Networks for Single Image Super-Resolution; 4.2.1 Introduction; 4.2.2 The Proposed Method; 4.2.3 Implementation Details; 4.2.4 Experimental Results; 4.2.4.1 Network Architecture Analysis; 4.2.4.2 Comparison with State-of-the-Art; 4.2.4.3 Runtime Analysis; 4.2.5 Conclusion and Future Work; References
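The "model unfolding" idea in Chapter 3 (and the "network implementation of sparse coding" in Chapter 4) starts from running an iterative sparse-coding solver, ISTA, for a fixed number of steps and viewing each step as a network layer whose weights can later be learned (the LISTA construction). The following numpy sketch shows only that starting point; the function names and the test signal are illustrative, not taken from the book:

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_unfolded(D, y, lam=0.05, n_layers=500):
    """Fixed number of ISTA steps for min_z 0.5*||y - D z||^2 + lam*||z||_1.

    Truncating ISTA to n_layers steps, then treating W, S, and the threshold
    as learnable per-layer parameters, is the unfolding step that turns this
    solver into a feed-forward network.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    W = D.T / L                            # "input" weight shared by all layers
    S = np.eye(D.shape[1]) - D.T @ D / L   # "recurrent" weight shared by all layers
    z = soft_threshold(W @ y, lam / L)     # first layer
    for _ in range(n_layers - 1):
        z = soft_threshold(W @ y + S @ z, lam / L)
    return z

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)            # unit-norm dictionary atoms
z_true = np.zeros(50)
z_true[[3, 17, 41]] = [1.5, -2.0, 1.0]    # a 3-sparse code
y = D @ z_true                            # observation to be sparsely coded
z_hat = ista_unfolded(D, y)
```

With enough layers the output recovers the sparse code up to the usual l1 shrinkage bias; the learned (LISTA-style) variant achieves similar accuracy in far fewer layers.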

SUMMARY OR ABSTRACT

Text of Note
Deep Learning through Sparse Representation and Low-Rank Modeling bridges classical sparse and low-rank models, which emphasize problem-specific interpretability, with recent deep network models that have enabled larger learning capacity and better utilization of big data. It shows how the toolkit of deep learning is closely tied to sparse/low-rank methods and algorithms, providing a rich variety of theoretical and analytic tools to guide the design and interpretation of deep learning models. The development of the theory and models is supported by a wide variety of applications in computer vision, machine learning, signal processing, and data mining. The book will be highly useful for researchers, graduate students, and practitioners working in computer vision, machine learning, signal processing, optimization, and statistics.
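For readers unfamiliar with the low-rank half of the title: the basic building block of low-rank modeling is the best rank-r approximation of a matrix, obtained by truncating its SVD (the Eckart-Young theorem). A minimal numpy sketch, with an illustrative synthetic matrix rather than any example from the book:

```python
import numpy as np

def best_rank_r(M, r):
    """Best rank-r approximation of M in Frobenius norm (Eckart-Young):
    keep the r largest singular values and their singular vectors."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 4)) @ rng.standard_normal((4, 30))  # exactly rank 4
noisy = A + 0.01 * rng.standard_normal((30, 30))                 # small dense noise
denoised = best_rank_r(noisy, 4)                                 # recovers A closely
```

Because the noise spreads its energy across all singular directions while the signal concentrates in four, truncation removes most of the noise, which is the intuition behind low-rank denoising and matrix completion.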

ACQUISITION INFORMATION NOTE

Source for Acquisition/Subscription Address
Ingram Content Group
Stock Number
9780128136607

OTHER EDITION IN ANOTHER MEDIUM

Title
Deep learning through sparse and low-rank modeling.
International Standard Book Number
9780128136591

TOPICAL NAME USED AS SUBJECT

Machine learning.
COMPUTERS-- General.

SUBJECT CATEGORY (Provisional)

COM-- 000000

DEWEY DECIMAL CLASSIFICATION

Number
006.31
Edition
23

LIBRARY OF CONGRESS CLASSIFICATION

Class number
Q325.5

PERSONAL NAME - ALTERNATIVE RESPONSIBILITY

Fu, Yun
Huang, Thomas S., 1936-
Wang, Zhangyang

ORIGINATING SOURCE

Date of Transaction
20200822085100.0
Cataloguing Rules (Descriptive Conventions)
pn

ELECTRONIC LOCATION AND ACCESS

Electronic name
Read the full text of the book


This website is managed by Dar Al-Hadith Scientific-Cultural Institute and Computer Research Center of Islamic Sciences (also known as Noor)
Libraries are responsible for the validity of their information, and the moral rights to that information are reserved for them.