Advanced Ubiquitous Computing (2017-2)

“Student-teacher relationships are based on trust. Acts that violate this trust undermine the educational process. Your classmates and the instructor will not tolerate violations of academic integrity.”

1. Course Schedule & Lecture Notes

  • Sept. 12 – Single Neuron

  • Sept. 19 – Multiple Neurons

    • Laboratory

  • Sept. 26 – Backward Propagation

  • Oct. 03 – Chuseok (Korean Thanksgiving holiday)

  • Oct. 10 – Multi-layer Neural Networks

    • Laboratory

  • Oct. 17 – Optimization Technology

    • Adam

      • Diederik P. Kingma and Jimmy Ba, “Adam: A Method for Stochastic Optimization,” Proceedings of the 3rd International Conference on Learning Representations (ICLR), 2015. Link, 허주성, 권도형
    • Laboratory
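
As a rough sketch of the Adam update rule from the Kingma & Ba paper above (a minimal NumPy illustration for the lecture topic, not the lab code; the quadratic objective and all names here are hypothetical):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moment estimates plus bias correction."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad**2    # second moment (uncentered variance)
    m_hat = m / (1 - beta1**t)               # bias correction for the warm-up phase
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    x, m, v = adam_step(x, 2.0 * x, m, v, t, lr=0.05)
```

The per-parameter scaling by the second moment is what distinguishes Adam from plain momentum SGD.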

  • Oct. 24 – Optimization Technology

    • Laboratory

  • Oct. 31 – Parameter Initialization

    • Xavier Initializer
      • Xavier Glorot, Yoshua Bengio, “Understanding the difficulty of training deep feedforward neural networks,” Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:249-256, 2010. Link, 김영진, 김한진, 우용근
    • He Initializer
      • Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun, “Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification,” IEEE International Conference on Computer Vision (ICCV), DOI: 10.1109/ICCV.2015.123, 2015 Link 기철민, 김현국
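
The two initializers cited above can be sketched in a few lines of NumPy (an illustrative sketch; shapes and names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    """Glorot/Xavier uniform: variance 2/(fan_in+fan_out), for tanh/sigmoid nets."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    """He normal: std = sqrt(2/fan_in), compensating ReLU's halved variance."""
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

W1 = xavier_init(784, 256)   # e.g., first layer of an MNIST-sized network
W2 = he_init(784, 256)
```

Both schemes keep activation (and gradient) variance roughly constant across layers, which is the difficulty the Glorot & Bengio paper analyzes.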

  • Nov. 07 – Parameter Initialization

    • Laboratory

  • Nov. 14 – Learning Acceleration and Overfitting Prevention

    • Batch Normalization
      • Sergey Ioffe and Christian Szegedy, “Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift,” The 32nd International Conference on Machine Learning (ICML-15), pp. 448-456, 2015 Link, 임현교, 김주봉
      • Dmytro Mishkin and Jiri Matas, “All you need is a good init,” International Conference on Learning Representations (ICLR), San Juan, Puerto Rico, 2016. Link, 류시호, 최성준
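
The forward pass of batch normalization from the Ioffe & Szegedy paper can be sketched as follows (training-time behavior only; running statistics for inference are omitted, and the input here is synthetic):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift."""
    mu = x.mean(axis=0)                      # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)    # zero mean, unit variance per feature
    return gamma * x_hat + beta              # learnable scale/shift restore capacity

# Toy batch: 64 examples, 10 features, deliberately far from zero mean
x = np.random.default_rng(1).normal(5.0, 3.0, size=(64, 10))
y = batchnorm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
```

With gamma = 1 and beta = 0 the output is simply the standardized activations, which is what stabilizes the distribution each layer sees during training.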

  • Nov. 21 – Learning Acceleration and Overfitting Prevention

    • Dropout
      • Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov, “Dropout: A Simple Way to Prevent Neural Networks from Overfitting,” The Journal of Machine Learning Research, Vol. 15, No. 1, pp. 1929-1958, Jan. 2014. Link 지상엽, 이의혁
    • Random Search
      • James Bergstra, Yoshua Bengio, “Random Search for Hyper-Parameter Optimization,” Journal of Machine Learning Research, Vol. 13, No. 1, pp. 281-305, January 2012 Link, 황제민, 정진서
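
The two papers above combine naturally: inverted dropout as a regularizer, and random search (Bergstra & Bengio) to tune its keep-probability. A minimal sketch, where the scoring function is a hypothetical stand-in for a real validation run:

```python
import numpy as np

rng = np.random.default_rng(42)

def inverted_dropout(x, p_keep):
    """Inverted dropout: scale at train time so test time needs no rescaling."""
    mask = (rng.random(x.shape) < p_keep) / p_keep
    return x * mask

h = inverted_dropout(np.ones(10000), p_keep=0.5)   # expected value stays ~1

# Random search: sample hyperparameters independently instead of on a grid.
def toy_score(p_keep):
    return -(p_keep - 0.8) ** 2    # hypothetical: pretend 0.8 is optimal

trials = [rng.uniform(0.3, 1.0) for _ in range(50)]
best = max(trials, key=toy_score)
```

The paper's argument is that with a fixed trial budget, independent random samples cover the important hyperparameter dimensions far better than a grid does.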

  • Nov. 28 – Learning Acceleration and Overfitting Prevention

    • Laboratory

  • Dec. 05 – Convolutional Neural Networks

    • Laboratory
  • Dec. 12 – [기말고사]

2. Term Project Guide

  • Term project title: Intelligent Software on a Ubiquitous Platform
  • Term project purpose: load a deep learning model onto an embedded device/OS and use it to perform an intelligent decision task (classification or prediction)
  • Target platforms: iPhone, Android Phone, Raspberry Pi
  • Requirements
    • Build one application that runs on one of the platforms above
    • Embed in that application a deep learning model you wrote yourself in class
    • Perform the training off the embedded platform
    • Data may be obtained from any source (self-produced data is also allowed)
    • The goal of the deep learning application is chosen by the student

3. Course Information

4. Logistics

  • Attendance – Each absence will deduct two points out of 100. Three absences will result not in a mere point deduction but in failure of the course (grade ‘F’).
  • Exam – A final exam will evaluate the knowledge learned in class.
  • Book Report – Students should read one of the books listed in the references and submit a book report.
  • Homework – Homework will carry significant weight in the evaluation.

5. Evaluation

  • Attendance (10%)
  • Term Project (20%)
  • Paper Presentation (20%)
  • Final Examination (50%)