A principal feature analysis

Authors

Abstract

A key task of data science is to identify relevant features linked to certain output variables that are supposed to be modeled or predicted. To obtain a small but meaningful model, it is important to find stochastically independent features capturing all the information necessary to model or predict the output variables sufficiently. Therefore, we introduce in this work a framework to detect linear and non-linear dependencies between different features. As we will show, features that are actually functions of other features do not represent further information. Consequently, a model reduction neglecting such features conserves the information, reduces noise and thus improves the quality of the model. Furthermore, a smaller model makes it easier to adopt a model of a given system. In addition, the approach structures the dependencies within the considered features. This provides advantages for classical modeling, starting from regression and ranging to differential equations, as well as for machine learning. To show the generality and applicability of the presented framework, 2154 features of a data center are measured for a classification of faulty and non-faulty states of the set-up. The number of features is automatically reduced by the framework to 161. The prediction accuracy is even improved compared to a model trained on the total number of features. A second example is the analysis of a gene expression data set, where from 9513 genes 9 genes are extracted from whose expression levels two cell clusters of macrophages can be distinguished.
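As a rough illustration of the idea in the abstract (discarding features that are, up to noise, functions of other features), the following Python snippet is a minimal sketch and not the authors' algorithm: it greedily keeps a feature only if its estimated mutual information with every already kept feature stays below a threshold, so non-linearly redundant features are dropped. The function name, the threshold, and the use of scikit-learn's mutual-information estimator are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only; NOT the framework from the paper.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def greedy_dependency_reduction(X, threshold=0.9):
    """Keep a feature only if no already-kept feature explains it
    (dependence estimated via mutual information, which also captures
    non-linear relationships). Threshold is an assumed tuning choice."""
    kept = []
    for j in range(X.shape[1]):
        # Rough scale for normalisation: MI of the candidate with itself.
        self_mi = mutual_info_regression(X[:, [j]], X[:, j], random_state=0)[0]
        redundant = False
        for i in kept:
            mi = mutual_info_regression(X[:, [i]], X[:, j], random_state=0)[0]
            if self_mi > 0 and mi >= threshold * self_mi:
                redundant = True
                break
        if not redundant:
            kept.append(j)
    return kept

# Toy usage: x2 is a non-linear (monotone) function of x0 and should be dropped.
rng = np.random.default_rng(0)
x0 = rng.normal(size=500)
x1 = rng.normal(size=500)
x2 = x0 ** 3
X = np.column_stack([x0, x1, x2])
# Expected to keep features 0 and 1; exact values depend on the MI estimator.
print(greedy_dependency_reduction(X))
```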


Similar articles

Feature reduction of hyperspectral images: Discriminant analysis and the first principal component

When the number of training samples is limited, feature reduction plays an important role in classification of hyperspectral images. In this paper, we propose a supervised feature extraction method based on discriminant analysis (DA) which uses the first principal component (PC1) to weight the scatter matrices. The proposed method, called DA-PC1, copes with the small sample size problem and has...


Feature Dimension Reduction of Multisensor Data Fusion using Principal Component Fuzzy Analysis

One of the most important areas of research today, across many different applications and tools, is how to obtain awareness. One serious application is awareness of the behavior and activities of patients; its importance stems from the need for ubiquitous medical care for individuals. It is sometimes very important that the doctor knows the patient's physical condition. O...


Orthogonal Principal Feature Selection

This paper presents a feature selection method based on the popular transformation approach: principal component analysis (PCA). It is popular because it finds the optimal solution to several objective functions (including maximum variance and minimum sum-squared-error), and also because it provides an orthogonal basis solution. However, PCA as a dimensionality reduction algorithm does not explic...
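For context, a common heuristic in this family of methods (not necessarily the exact method of the cited paper) selects original features by their loadings on the leading principal components, so the chosen features roughly span the subspace PCA identifies. The sketch below, with an assumed function name and toy data, illustrates that heuristic.

```python
# Simplified PCA-loading-based feature selection; an illustration only.
import numpy as np
from sklearn.decomposition import PCA

def pca_loading_feature_selection(X, n_select):
    pca = PCA(n_components=n_select).fit(X)
    selected = []
    for component in pca.components_:            # shape: (n_features,)
        # Take the highest-|loading| feature not already chosen.
        for idx in np.argsort(-np.abs(component)):
            if idx not in selected:
                selected.append(int(idx))
                break
    return selected

# Toy usage: 6 features, the last two are near-copies of the first two.
rng = np.random.default_rng(1)
base = rng.normal(size=(200, 4))
X = np.column_stack([base,
                     base[:, 0] + 0.01 * rng.normal(size=200),
                     base[:, 1] + 0.01 * rng.normal(size=200)])
print(pca_loading_feature_selection(X, n_select=4))
```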


Convex Principal Feature Selection

A popular approach for dimensionality reduction and data analysis is principal component analysis (PCA). A limiting factor with PCA is that it does not inform us on which of the original features are important. There is a recent interest in sparse PCA (SPCA). By applying an L1 regularizer to PCA, a sparse transformation is achieved. However, true feature selection may not be achieved as non-spa...
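To make the sparse-PCA point above concrete, here is a small toy illustration (assuming scikit-learn's SparsePCA; the data and parameter choices are invented for this example): the L1 penalty zeroes out many loadings, but the set of features carrying a non-zero loading on some component need not form a small, well-defined selection.

```python
# Toy illustration of sparse PCA loadings; not from the cited paper.
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))
X[:, 3] = X[:, 0] + 0.1 * rng.normal(size=300)   # inject some correlation

# alpha is the L1 strength controlling how many loadings are driven to zero.
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)
loadings = spca.components_                       # shape: (3, 10), mostly zeros

# Features with a non-zero loading on any component; this union may still be
# large or unstable, which is why sparse loadings do not equal feature selection.
selected = np.unique(np.nonzero(loadings)[1])
print(loadings)
print("features with non-zero loadings:", selected)
```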


Principal Feature Analysis: A Multivariate Feature Selection Method for fMRI Data

Brain decoding with functional magnetic resonance imaging (fMRI) requires analysis of complex, multivariate data. Multivoxel pattern analysis (MVPA) has been widely used in recent years. MVPA treats the activation of multiple voxels from fMRI data as a pattern and decodes brain states using pattern classification methods. Feature selection is a critical procedure of MVPA because it decides whic...



Journal

Journal title: Journal of Computational Science

Year: 2022

ISSN: 1877-7511, 1877-7503

DOI: https://doi.org/10.1016/j.jocs.2021.101502