
Weekly-#25 AI infra


LLM Inference

Finishing a mature, iterable project is a great milestone, even if it’s still in its first stage.

There are plenty of AI infra repositories and projects on GitHub, but most of them are designed and developed for NVIDIA GPUs, and some for AMD.

LLM computation is a huge industry and market, and there are also many domestic GPU brands in China. As a result, there is plenty of room to make domestic GPUs perform at their best.

This field is fun, but it demands plenty of patience and strong coding ability.

In addition, the knowledge in this field is more stable than in other fields because it changes less frequently. For example, knowledge about CPUs has changed far less over recent decades than programming languages have, especially Python. As a result, I can invest real energy in this field, because it pays off not only in money but also in research results.

LLM Application

LLM applications are another great direction related to LLMs.

LLMs create new capabilities that never existed before, much like the invention of electricity and the steam engine.

As a result, we can solve more real-world problems with LLMs, and the feedback is more direct when we succeed.

In addition, demand for LLM inference is built on demand for LLM applications; it will last only if there are genuinely useful LLM applications.

As for the specific work of building LLM applications, I need a deep understanding of both the existing field or problem and the capabilities of LLMs; things work when I integrate the two.

We are living in the best of ages because of AI; don’t waste it.

This post is licensed under CC BY 4.0 by the author.