Academic Year |
2025 |
School/Graduate School |
Graduate School of Advanced Science and Engineering (Master's Course) Division of Advanced Science and Engineering Informatics and Data Science Program |
Lecture Code |
WSN22001 |
Subject Classification |
Specialized Education |
Subject Name |
Data Management |
Subject Name (Katakana) |
データ マネジメント |
Subject Name in English |
Data Management |
Instructor |
MORIMOTO YASUHIKO |
Instructor (Katakana) |
モリモト ヤスヒコ |
Campus |
Higashi-Hiroshima |
Semester/Term |
1st-Year, Second Semester, 3rd Term |
Days, Periods, and Classrooms |
(3T) Wed 5-8: ENG 106 |
Lesson Style |
Lecture |
Lesson Style (More Details) |
Face-to-face |
Credits |
2.0 |
Class Hours/Week |
4 |
Language of Instruction |
B : Japanese/English |
Course Level |
6 : Graduate Advanced |
Course Area (Area) |
25 : Science and Technology |
Course Area (Discipline) |
02 : Information Science |
Eligible Students |
Master course and Doctor course students |
Keywords |
|
Special Subject for Teacher Education |
|
Special Subject |
|
Class Status within Educational Program (Applicable only to targeted subjects for undergraduate students) |
|
Criterion referenced Evaluation (Applicable only to targeted subjects for undergraduate students) |
|
Class Objectives /Class Outline |
In this class, we will learn about explainable AI, so-called XAI. AI created by machine learning from big data has already surpassed human ability in recognition and classification tasks. On the other hand, many high-performance models based on neural networks are black boxes, and there is no way for humans to judge the validity of a model's output. XAI technology has therefore become popular, and this technology for discovering explanations from models has much in common with data mining technology. In this class, we will explain XAI technology and related data mining technology from the viewpoint of discovering useful knowledge from models. Data mining technology, which extracts useful knowledge from large-scale data, has developed rapidly since the latter half of the 1990s and has come to be involved in decision-making in all aspects of our daily lives. This class previously covered the main data mining techniques, but that content has become commoditized and is now essential rather than specialized knowledge. In response to this change, the major data mining techniques are covered in the specialized course "Data Mining" in the Faculty of Informatics and Data Science; those who want to learn them should take that undergraduate class. For this reason, it is desirable to take this class after taking the undergraduate "Data Mining" class, although you can take this lecture even if you have not taken "Data Mining".
This class is conducted in a PBL (problem-based) style that combines explanations, investigations, and exercises. Students create and submit reports for the assignments and exercises given in class. |
Class Schedule |
1-2: Guidance and preparation for exercises. The instructor explains the outline of this course and gives guidance on how to take the PBL-style (problem-based) class, which combines explanation, investigation, and exercises. Students also set up Python and the environment used in the exercises.
3-4: Basics of machine learning and artificial intelligence, and overview of XAI. Learn the basics of machine learning and artificial intelligence needed for this course (supervised learning, decision trees, neural networks, etc.). Building on these, learn the background of XAI technology and a technical overview of it. In the exercises, analyze the importance of features and the relationship between features and predicted values.
5-6: Ensemble learning models and local explanation. Learn about high-performance prediction models based on ensemble learning, and practice extracting explanations for the model's judgments using the local explanation tool LIME.
7-8: SHAP. Learn about the Shapley value (a feature's contribution to a prediction) based on cooperative game theory and the algorithms that compute it. In addition, practice local explanations using SHAP (a minimal usage sketch follows this schedule).
9-10: Global explanation. Learn about global explanation techniques that explain the operating and judgment principles of the model itself, rather than explaining individual instances, and practice using global explanation tools.
11-12: XAI for image classification. AI and XAI are also applied to data other than relational tables. Learn about AI techniques for image classification and the XAI techniques for them, and practice using them.
13-14: XAI for text classification. Continuing with non-relational data, learn about AI techniques for text classification and the XAI techniques for them, and practice using them.
15: Generative AI and its profiling. In recent years, personality trait analysis based on the OCEAN Big Five has come into use in information processing, and models that combine images and language are developing rapidly. We will look ahead and consider whether the operating principles of AI, including generative systems, can be explained from such new approaches. |
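As a reference for the local-explanation exercises in sessions 5-8, the following is a minimal sketch, not part of the official course materials, of a SHAP-based local explanation for a tree ensemble; the dataset, the model, and the use of scikit-learn with the shap package are illustrative assumptions.

```python
# Minimal local-explanation sketch for sessions 5-8 (illustrative only, not course material).
# Assumes scikit-learn and the shap package are installed.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Train a small ensemble model on a bundled tabular dataset.
X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Local explanation: per-feature Shapley-value contributions for one test instance.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test.iloc[:1])  # shape: (1, n_features)
for name, value in zip(X.columns, shap_values[0]):
    print(f"{name}: {value:+.3f}")
```

The printed values show how much each feature pushed this one prediction above or below the model's expected output, which is the kind of per-instance explanation practiced in the exercises.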
Text/Reference Books,etc. |
Papers and reference books will be introduced in class. Original handouts related to the class topics will be provided. |
PC or AV used in Class,etc. |
Handouts, moodle |
(More Details) |
|
Learning techniques to be incorporated |
PBL (Problem-based Learning)/ TBL (Team-based Learning) |
Suggestions on Preparation and Review |
Generally, no specific preparation is required, but here is some advice on reviewing, exploring, and practicing each unit.
3-4: Basics of Machine Learning and Artificial Intelligence and XAI Overview. Decision trees and neural networks in particular will be needed in later lectures and exercises, so please review their principles thoroughly.
5-6: Ensemble Learning Models and Local Explanations. Decision trees are often used as weak learners in ensemble learning. Although a decision tree is a simple model, it is worth realizing through the exercises that ensembles of decision trees achieve performance comparable to deep learning models. We also plan to use LIME, a representative XAI tool, in the exercises. If you master it, you will be able to apply XAI in practical work after you graduate.
7-8: SHAP. SHAP is a representative XAI tool, like LIME. As with LIME, mastering this tool will let you apply XAI in practical work after you graduate, so please practice thoroughly until you can use it on your own.
9-10: Global Explanation. Specifically, we will explain the concepts of Permutation Feature Importance and Partial Dependence, which are known as XAI technologies, and perform actual analyses (a minimal sketch follows this section). If necessary, use these keywords to search for literature, explanations, and calculation examples for reference.
11-14: In the exercises, we will use LIME, Grad-CAM, Integrated Gradients, BERT, etc. Beyond the tools used in these exercises, images and text are dynamic fields in which new, more powerful models appear every year. Use these exercises as a stepping stone to try XAI analysis with a new model. |
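For the global-explanation unit in sessions 9-10, the following is a minimal sketch of Permutation Feature Importance and Partial Dependence using scikit-learn's inspection module; the dataset and model are illustrative assumptions and not the materials actually used in class.

```python
# Minimal global-explanation sketch for sessions 9-10 (illustrative only, not course material).
from sklearn.datasets import load_diabetes
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import partial_dependence, permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

# Permutation Feature Importance: drop in the test score when each feature is shuffled.
pfi = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, mean in sorted(zip(X.columns, pfi.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {mean:.3f}")

# Partial Dependence: average prediction as a single feature ("bmi") is varied over a grid.
pd_result = partial_dependence(model, X_test, features=["bmi"])
print(pd_result["average"][0])  # averaged predictions at each grid point of "bmi"
```

Unlike the LIME and SHAP examples, both quantities describe the model as a whole rather than any single instance, which is the distinction between global and local explanation made in the lectures.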
Requirements |
|
Grading Method |
Report assignments: 80%; participation in each assignment: 20% |
Practical Experience |
|
Summary of Practical Experience and Class Contents based on it |
|
Message |
|
Other |
|
Please fill in the class improvement questionnaire, which is conducted for all classes. Instructors will reflect on your feedback and use it to improve their teaching.