
ML Study Week 15 Assignment #33

Open
whitesoil opened this issue Jan 15, 2019 · 0 comments
Comments

@whitesoil (Member)
Week 15: Feature selection, PCA, p-value, weight initialization, hyperparameter optimization

Due date: January 18

정초이

Feature selection, PCA

  • How to select meaningful features
    • Sparse solutions with L1 regularization (sketched in code below)
    • Sequential feature selection algorithms
    • etc...
  • PCA (sketched in code below)
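
A minimal sketch of the two topics in the list above, assuming scikit-learn; the toy dataset (load_wine) and the regularization strength C=0.1 are illustrative choices, not part of the assignment.

```python
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SelectFromModel
from sklearn.decomposition import PCA

X, y = load_wine(return_X_y=True)
X = StandardScaler().fit_transform(X)

# Sparse solution via L1 regularization: features whose coefficients
# are driven to zero by the L1 penalty are dropped.
l1_model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
selector = SelectFromModel(l1_model).fit(X, y)
print("Kept feature indices:", selector.get_support().nonzero()[0])

# PCA: project the standardized data onto the two directions of largest variance.
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)
print("Explained variance ratio:", pca.explained_variance_ratio_)
```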

남궁선

P-value

  • Null hypothesis (귀무가설, also called 영가설)
  • etc... (a brief hypothesis-test sketch follows this list)
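
A minimal sketch of a null-hypothesis test and its p-value, assuming SciPy; the synthetic data and the 5% significance level are illustrative choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(loc=0.0, scale=1.0, size=100)
group_b = rng.normal(loc=0.3, scale=1.0, size=100)

# Null hypothesis: the two groups have the same mean.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

# The p-value is the probability, under the null hypothesis, of observing a
# test statistic at least as extreme as the one measured.
if p_value < 0.05:
    print("Reject the null hypothesis at the 5% significance level.")
else:
    print("Fail to reject the null hypothesis.")
```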

조민지

Weight Initialization, Hyperparameter optimization

  • Weight initialization (sketched in code below)
    • Xavier
    • He
    • etc...
  • Hyperparameter optimization
    1. Practical way (sketched in code below)
      1. Set a search range for each hyperparameter.
      2. Randomly sample hyperparameter values from that range.
      3. Train with the sampled values and evaluate accuracy on the validation data (keeping the number of epochs small).
      4. Repeat steps 2 and 3 a set number of times (e.g., 100), then narrow the hyperparameter ranges based on the accuracy results.
    2. Theoretical way
      • Bayesian optimization
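
Two minimal sketches for the topics above, in plain NumPy. The layer sizes, the fake validation score, and the search ranges for the learning rate and weight decay are illustrative assumptions; only the initialization formulas and the four practical steps follow the outline in this issue.

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # One common form of Xavier initialization: std = sqrt(1 / fan_in),
    # suited to tanh/sigmoid activations.
    return rng.normal(0.0, np.sqrt(1.0 / fan_in), size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He initialization: std = sqrt(2 / fan_in), suited to ReLU activations.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

W1 = he_init(784, 100)  # e.g. the first layer of a hypothetical 784-100-10 network
print("He-initialized std:", W1.std())  # close to sqrt(2 / 784)
```

The random-search loop below follows the four practical steps listed above; the training function is a stand-in, not a real model.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_and_validate(lr, weight_decay):
    # Placeholder for a short training run (few epochs) that returns
    # validation accuracy; a fake score stands in for real training here.
    return 1.0 / (1.0 + abs(np.log10(lr) + 3.0) + 10.0 * weight_decay)

# Step 1: set a search range for each hyperparameter (log scale here).
results = []
for _ in range(100):                      # Step 4: repeat, e.g. 100 times
    lr = 10 ** rng.uniform(-6, -1)        # Step 2: sample values at random
    wd = 10 ** rng.uniform(-8, -2)
    acc = train_and_validate(lr, wd)      # Step 3: short training + validation
    results.append((acc, lr, wd))

best_acc, best_lr, best_wd = max(results)
print(f"best accuracy {best_acc:.3f} at lr={best_lr:.2e}, weight_decay={best_wd:.2e}")
# Based on the best results, narrow the ranges and repeat the search.
```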