This self-taught AI engineer is on a roll: he has no ML degree, yet he landed an offer from DeepMind
To structure the rest of his learning, he developed a framework: a rough plan to cover all the sub-domains of AI in a year, spending about 3 months on each sub-domain.
It was a difficult start. The first sub-domain he studied, neural style transfer (NST), took more than 3 months. With this in mind, he refined the plan by interspersing micro-learning cycles within the larger framework of one sub-domain every 3 months.
There are two types of micro-learning cycles.
1. Input mode: massive intake of information. The goal of this mode is to gain a deeper understanding of the structure of the sub-domain through blogs, videos, etc., or to gain a deeper understanding of a topic through research papers or books.
2. Output mode: sharing what you have learned. Make YouTube videos, set up GitHub projects, write blog posts, update LinkedIn, and share relevant content on Twitter and Discord.
Of course, he couldn't abandon his day job. Balancing work and study took strong willpower.
He keeps an almost "crazy" rhythm: wake up, write code for 2 hours, take a walk, do his day job at Microsoft, take a 30-minute nap after finishing work, then put in another 2-3 hours before bed.
He sums up this part of his experience in three points: strong perseverance, the right mindset, and "naps are golden."
To teach people how to fish rather than just handing them fish, he describes in detail his methodology for learning ML sub-domains such as NST and GANs.
Keep learning
Reading books and cutting-edge papers is the most direct way to acquire ML-related knowledge.
In the process of learning about neural style transfer (NST), DeepDream, generative adversarial networks (GANs), NLP & Transformers, reinforcement learning (RL), and more, he read many cutting-edge and niche papers; the Transformer-related papers in particular supplied part of the inspiration for his popular PyTorch implementation of GAT (graph attention networks).
Graph ML
His beginner-friendly GAT implementation was a big hit, and was recommended as material for the GNN lectures at Cambridge University.
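The core idea behind GAT, the model his project implements, is that each node aggregates its neighbours' projected features, weighted by learned attention coefficients. Below is a minimal pure-Python sketch of a single attention head following the GAT paper's formulation; the function and variable names (`gat_attention`, `W`, `a`) are illustrative and not taken from his repository:

```python
import math

def leaky_relu(x, slope=0.2):
    return x if x > 0 else slope * x

def gat_attention(h, adj, W, a):
    """One single-head GAT layer (sketch).
    h:   list of node feature vectors
    adj: adjacency lists (neighbour indices per node)
    W:   shared weight matrix (out_dim x in_dim)
    a:   attention vector of length 2*out_dim
    """
    # Project every node's features: z_i = W h_i
    z = [[sum(W[r][c] * hi[c] for c in range(len(hi)))
          for r in range(len(W))] for hi in h]
    out = []
    for i, nbrs in enumerate(adj):
        cand = [i] + nbrs  # attend over self + neighbours
        # Raw attention logits: e_ij = LeakyReLU(a . [z_i || z_j])
        e = [leaky_relu(sum(av * zv for av, zv in zip(a, z[i] + z[j])))
             for j in cand]
        # Numerically stable softmax over the neighbourhood
        m = max(e)
        exp_e = [math.exp(v - m) for v in e]
        s = sum(exp_e)
        alpha = [v / s for v in exp_e]
        # Weighted aggregation: h'_i = sum_j alpha_ij z_j
        out.append([sum(alpha[k] * z[j][d] for k, j in enumerate(cand))
                    for d in range(len(z[i]))])
    return out
```

Real implementations (including PyTorch ones) vectorize this with tensor ops and run several heads in parallel, but the per-node softmax over neighbours is the part that sets GAT apart from plain graph convolutions.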
OTHER NEWS
- To be more energy efficient, almost all Samsung home appliances will support Wi-Fi in the future
- 31 Aug 2022
- What will Apple's AR headset be called? Trademark filings reveal candidate names: Reality One, Reality Pro, etc.
- 29 Aug 2022
- Meta Quest Pro, a genuinely high-end VR headset, revealed after a prototype was left behind in a hotel
- 13 Sep 2022
- Thirty-year drug-industry veteran Derek Lowe slams AlphaFold: making drugs based on structural predictions is "pure self-importance"
- 23 Aug 2022
- Senior Google software engineer fired after claiming the company's AI chatbot was self-aware
- 25 Jul 2022
- Valuation of the company behind Stable Diffusion climbs to $6.9 billion, just one month after the project's launch
- 13 Sep 2022