Statistical Learning with Sparsity: The Lasso and Generalizations
Publisher: Chapman & Hall
Authors: Trevor Hastie, Robert Tibshirani, Martin Wainwright
ISBN: 9781498712163
Publication date: July 2015
Type: Imported (foreign-language) title
Language: English
Pages: 367
Price: 89,000 KRW
Availability: Stock confirmation required before ordering
Table of Contents

Introduction

The Lasso for Linear Models
Introduction
The Lasso Estimator
Cross-Validation and Inference
Computation of the Lasso Solution
Degrees of Freedom
Uniqueness of the Lasso Solutions
A Glimpse at the Theory
The Nonnegative Garrote
ℓq Penalties and Bayes Estimates
Some Perspective

Generalized Linear Models
Introduction
Logistic Regression
Multiclass Logistic Regression
Log-Linear Models and the Poisson GLM
Cox Proportional Hazards Models
Support Vector Machines
Computational Details and glmnet

Generalizations of the Lasso Penalty
Introduction
The Elastic Net
The Group Lasso
Sparse Additive Models and the Group Lasso
The Fused Lasso
Nonconvex Penalties

Optimization Methods
Introduction
Convex Optimality Conditions
Gradient Descent
Coordinate Descent
A Simulation Study
Least Angle Regression
Alternating Direction Method of Multipliers
Minorization-Maximization Algorithms
Biconvexity and Alternating Minimization
Screening Rules

Statistical Inference
The Bayesian Lasso
The Bootstrap
Post-Selection Inference for the Lasso
Inference via a Debiased Lasso
Other Proposals for Post-Selection Inference

Matrix Decompositions, Approximations, and Completion
Introduction
The Singular Value Decomposition
Missing Data and Matrix Completion
Reduced-Rank Regression
A General Matrix Regression Framework
Penalized Matrix Decomposition
Additive Matrix Decomposition

Sparse Multivariate Methods
Introduction
Sparse Principal Components Analysis
Sparse Canonical Correlation Analysis
Sparse Linear Discriminant Analysis
Sparse Clustering

Graphs and Model Selection
Introduction
Basics of Graphical Models
Graph Selection via Penalized Likelihood
Graph Selection via Conditional Inference
Graphical Models with Hidden Variables

Signal Approximation and Compressed Sensing
Introduction
Signals and Sparse Representations
Random Projection and Approximation
Equivalence between ℓ0 and ℓ1 Recovery

Theoretical Results for the Lasso
Introduction
Bounds on Lasso ℓ2-error
Bounds on Prediction Error
Support Recovery in Linear Regression
Beyond the Basic Lasso

Bibliography

Author Index

Index
Book Description

Summary

Discover New Methods for Dealing with High-Dimensional Data

A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.
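
For readers who want the precise definition behind that sentence, the lasso estimate for a linear model with N observations and p predictors is the penalized least-squares solution below (the exact scaling of the squared-error term varies across references):

\hat{\beta} = \underset{\beta_0,\,\beta \in \mathbb{R}^p}{\mathrm{arg\,min}}
\left\{ \frac{1}{2N} \sum_{i=1}^{N} \left( y_i - \beta_0 - x_i^{\top} \beta \right)^2
        + \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert \right\}

The ℓ1 penalty with tuning parameter λ ≥ 0 is what sets many coefficients exactly to zero, producing the sparse fits the title refers to.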

Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.
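
As a concrete illustration of the coordinate descent idea mentioned above, here is a minimal NumPy sketch of cyclic coordinate descent for the lasso. It is not the book's glmnet implementation; it assumes y is centered and the columns of X are centered and scaled to unit variance, and the function names lasso_coordinate_descent and soft_threshold are illustrative rather than taken from any library.

import numpy as np

def soft_threshold(z, gamma):
    # S(z, gamma) = sign(z) * max(|z| - gamma, 0), the proximal map of the l1 penalty
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_sweeps=100):
    # Cyclic coordinate descent for (1/(2N)) * ||y - X beta||^2 + lam * ||beta||_1,
    # assuming centered y and centered, unit-variance columns of X (so no intercept).
    n, p = X.shape
    beta = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n        # (1/N) x_j^T x_j for each column j
    residual = y.copy()                        # residual y - X beta, with beta = 0
    for _ in range(n_sweeps):
        for j in range(p):
            r_j = residual + X[:, j] * beta[j]     # partial residual without x_j
            rho = X[:, j] @ r_j / n                # univariate least-squares statistic
            beta_j_new = soft_threshold(rho, lam) / col_norm[j]
            residual = r_j - X[:, j] * beta_j_new
            beta[j] = beta_j_new
    return beta

Each coordinate update is a univariate least-squares fit on the partial residual followed by soft-thresholding; a production implementation such as glmnet adds warm starts over a grid of lambda values, active-set cycling, and screening rules on top of this basic loop.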

In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.

Supplementary Materials
No supplementary teaching materials are available for this title.