
🌟 Hi, I'm Jiajun Liu.

AI Researcher & LLM Developer


🎯 About Me

"less is more"

  • 🔭 I'm currently pursuing my PhD in Computer Science and Technology at Southeast University.
  • 🌱 My research interests lie in knowledge distillation for large language models (LLMs) and knowledge graphs (KGs).
  • ❤️ I love programming and new technologies!
  • 👯 I enjoy doing interesting work with interesting friends!
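
To give a flavor of the knowledge-distillation theme above, here is a minimal sketch of the classic logit-matching KD objective (soft targets from a teacher, KL divergence to the student, temperature scaling), written in plain NumPy for illustration; the function names and shapes are illustrative, not taken from any of the repositories below.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Logit-matching KD: KL(teacher || student) on softened outputs,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()
```

The loss is zero when the student exactly reproduces the teacher's logits and grows as the softened distributions diverge; in practice it is combined with a standard cross-entropy term on the ground-truth labels.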

💻 Tech Stack

🤖 AI & Machine Learning

Python PyTorch TensorFlow scikit-learn Pandas NumPy

🎨 Web Development

React Node.js JavaScript HTML5 CSS3

πŸ› οΈ Development Tools

Git Docker VS Code Jupyter


💡 "less is more"

If you are interested in my research, feel free to contact me or join my algorithm research team!

Pinned Repositories

  1. seukgcode/IterDE

    [AAAI 2023] IterDE: An Iterative Knowledge Distillation Framework for Knowledge Graph Embeddings

    Python · 9 stars · 1 fork

  2. seukgcode/IncDE

    [AAAI 2024] Towards Continual Knowledge Graph Embedding via Incremental Distillation

    Python · 16 stars · 1 fork

  3. PtCoding

    [PyTorch Training Template] A framework for getting started with PyTorch training quickly.

    Python · 1 star

  4. seukgcode/FastKGE

    [IJCAI 2024] Fast and Continual Knowledge Graph Embedding via Incremental LoRA

    Python · 9 stars · 2 forks

  5. Awesome-Knowledge-Distillation-of-LLMs

    Forked from Tebmer/Awesome-Knowledge-Distillation-of-LLMs

    This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & V…

  6. distillm

    Forked from jongwooko/distillm

    Official PyTorch implementation of DistiLLM: Towards Streamlined Distillation for Large Language Models (ICML 2024)

    Python