AI Today Podcast – AI Glossary Series: Transformer Networks

AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion - A podcast by AI & Data Today

Transformer models have proven to be especially powerful for Natural Language Processing applications and image generation, and have been popularized by models such as GPT-3, Stable Diffusion, and BERT. In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the term Transformer Networks, explain how it relates to AI, and discuss why it's important to know about.

Want to dive deeper into an understanding of artificial intelligence, machine learning, or big data concepts? Want to learn how to apply AI and data using hands-on approaches and the latest technologies? Check out the hand-selected books in our Suggested Reading List that can help you expand your knowledge or put it to use.

Show Notes:

- FREE Intro to CPMAI mini course
- CPMAI Training and Certification
- Suggested Reading List
- AI Glossary
- Glossary Series: Recognition Systems, Computer Vision, ImageNet
- Glossary Series: Training Data, Epoch, Batch, Learning Curve
- Glossary Series: (Artificial) Neural Networks, Node (Neuron), Layer
- Glossary Series: Bias, Weight, Activation Function, Convergence, ReLU
- Glossary Series: Perceptron
- Glossary Series: Hidden Layer, Deep Learning
- Glossary Series: Loss Function, Cost Function & Gradient Descent
- Glossary Series: Backpropagation, Learning Rate, Optimizer
- Glossary Series: Feed-Forward Neural Network
- Glossary Series: OpenAI, GPT, DALL-E, Stable Diffusion
- Glossary Series: Natural Language Processing (NLP), NLU, NLG, Speech-to-Text, TTS, Speech Recognition
- What's GPT-3 and Why is it Important? Podcast
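As a companion to the episode, here is a minimal, illustrative Python sketch of scaled dot-product self-attention, the core operation inside transformer networks such as GPT-3 and BERT. The function name, toy inputs, and dimensions are our own illustrative assumptions, not code discussed in the episode.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: every position attends to every other
    position, weighting the values V by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise similarity, scaled by sqrt(d_k)
    # Numerically stable softmax over each row of scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted sum of value vectors

# Toy example: 3 tokens with 4-dimensional embeddings (illustrative numbers only)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)  # self-attention: Q = K = V = X
print(out.shape)  # (3, 4)
```

In a full transformer, this operation runs across multiple attention heads and stacked layers, with learned projection matrices producing Q, K, and V from the input embeddings.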
