
About This Book

Get up to speed with the deep learning concepts of PyTorch using a problem-solution approach. Starting with an introduction to PyTorch, you'll become familiar with tensors, the data structure PyTorch uses for arithmetic computation, and learn how tensor operations work. You will then look at probability distributions using PyTorch and become acquainted with the related concepts. From there you will dive into transformations and graph computations with PyTorch. Along the way you will look at common issues faced in neural network implementation and tensor differentiation, and find the best solutions for them.
Moving on to algorithms, you will learn how PyTorch works with supervised and unsupervised learning. You will see how convolutional neural networks, deep neural networks, and recurrent neural networks work in PyTorch. Finally, you will become acquainted with natural language processing and text processing using PyTorch.
What You Will Learn

- Master tensor operations for dynamic graph-based calculations using PyTorch
- Create PyTorch transformations and graph computations for neural networks
- Carry out supervised and unsupervised learning using PyTorch
- Work with deep learning algorithms such as CNNs and RNNs
- Build LSTM models in PyTorch
- Use PyTorch for text processing

Who This Book Is For
Readers wanting to dive straight into programming PyTorch.
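
To make the tensor material above concrete, here is a minimal sketch, not taken from the book, of the kind of tensor arithmetic and dynamic graph-based computation PyTorch supports; the values and shapes are arbitrary and assume only a standard PyTorch installation.

import torch

# Two small tensors; the values are arbitrary and purely illustrative.
a = torch.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
b = torch.tensor([[5.0, 6.0], [7.0, 8.0]])

c = a + b          # element-wise addition
d = a @ b          # matrix multiplication
loss = d.sum()     # reduce to a scalar so we can backpropagate

loss.backward()    # gradients flow through the dynamically built graph
print(c)
print(a.grad)      # gradient of the loss with respect to a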

Table of Contents

Frontmatter

Chapter 1. Introduction to PyTorch, Tensors, and Tensor Operations

PyTorch has been evolving into a broad framework for writing dynamic models. Because of that, it is very popular among data scientists and data engineers deploying large-scale deep learning systems. This book gives practitioners a structured way to handle the activities that arise while working on a practical data science problem. As is evident from the applications we use in our day-to-day lives, layers of intelligence are embedded in product features, and those features are enabled to provide a better experience and better service to the user.
Pradeepta Mishra

Chapter 2. Probability Distributions Using PyTorch

Probability and random variables are an integral part of computation in a graph-computing platform like PyTorch. Understanding probability and the associated concepts is essential. This chapter covers probability distributions and their implementation using PyTorch, as well as how to interpret test results.
Pradeepta Mishra
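
As a small illustration of the chapter's topic, and not code from the book itself, the following sketch uses PyTorch's torch.distributions package to sample from and score a couple of common distributions; the parameter values are arbitrary.

import torch
from torch.distributions import Normal, Bernoulli

# Standard normal distribution; loc and scale are illustrative choices.
normal = Normal(loc=0.0, scale=1.0)
samples = normal.sample((5,))          # draw five random values
log_p = normal.log_prob(samples)       # log-density of each sample

# Bernoulli distribution with an assumed success probability of 0.3.
bern = Bernoulli(probs=0.3)
flips = bern.sample((10,))             # ten 0/1 outcomes

print(samples, log_p, flips)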

Chapter 3. CNN and RNN Using PyTorch

Probability and random variables are an integral part of computation in a graph-computing platform like PyTorch. Understanding probability and the associated concepts is essential. This chapter covers probability distributions and their implementation using PyTorch, as well as how to interpret the results of a test. In probability and statistics, a random variable is also known as a stochastic variable, whose outcome depends on a purely random phenomenon. There are different types of probability distributions, including the normal, binomial, multinomial, and Bernoulli distributions. Each statistical distribution has its own properties.
Pradeepta Mishra
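
The chapter title refers to convolutional and recurrent networks. As a hypothetical sketch, and not the book's own model, the snippet below shows how such building blocks are typically defined with torch.nn; the channel counts, kernel size, and sequence shape are made up for illustration.

import torch
import torch.nn as nn

# A single convolutional layer: 1 input channel, 8 output channels, 3x3 kernel.
conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3)
image = torch.randn(1, 1, 28, 28)      # fake batch of one 28x28 grayscale image
feature_map = conv(image)              # shape: (1, 8, 26, 26)

# A single LSTM layer over a toy sequence of 10 steps with 4 features each.
lstm = nn.LSTM(input_size=4, hidden_size=16, batch_first=True)
sequence = torch.randn(1, 10, 4)
output, (h_n, c_n) = lstm(sequence)    # output shape: (1, 10, 16)

print(feature_map.shape, output.shape)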

Chapter 4. Introduction to Neural Networks Using PyTorch

Deep neural network–based models are gradually becoming the backbone of artificial intelligence and machine learning implementations. The future of data mining will be governed by the use of artificial neural network–based advanced modeling techniques. One obvious question is why neural networks are only now gaining so much importance, even though they were invented in the 1950s.
Pradeepta Mishra

Chapter 5. Supervised Learning Using PyTorch

Supervised machine learning is the most sophisticated branch of machine learning. It is used in almost all fields, including artificial intelligence, cognitive computing, and language processing. The machine learning literature broadly talks about three types of learning: supervised, unsupervised, and reinforcement learning. In supervised learning, the machine learns to recognize the output; hence, it is task driven, and the task can be classification or regression.
Pradeepta Mishra
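
Since the abstract frames supervised learning as classification or regression, here is a minimal, illustrative sketch, not the book's code, of a supervised regression loop in PyTorch; the synthetic data, learning rate, and epoch count are assumptions chosen only to make the example self-contained.

import torch
import torch.nn as nn

# Toy regression data: y = 2x + noise. Purely synthetic, for illustration only.
x = torch.randn(100, 1)
y = 2 * x + 0.1 * torch.randn(100, 1)

model = nn.Linear(1, 1)                          # single-weight linear model
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(200):                         # simple full-batch training loop
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)                  # compare predictions to labels
    loss.backward()                              # backpropagate the error
    optimizer.step()                             # update the weights

print(model.weight.item())                       # should approach 2.0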

Chapter 6. Fine-Tuning Deep Learning Models Using PyTorch

Deep learning models are becoming very popular. They have deep roots in the way biological neurons are connected and the way they transmit information from one node to another in a network model.
Pradeepta Mishra

Chapter 7. Natural Language Processing Using PyTorch

Natural language processing is an important branch of computer science. It is the study of human language by computers for performing various tasks. The study of natural language is also known as computational linguistics. There are two components of natural language processing: natural language understanding and natural language generation. Natural language understanding involves analyzing and interpreting the input language and responding to it. Natural language generation is the process of creating language from input text. Language can be used in various ways, and one word may have different meanings, so removing ambiguity is an important part of natural language understanding.
Pradeepta Mishra
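
As a small, hypothetical sketch of text processing with PyTorch, not drawn from the book, the snippet below tokenizes a toy sentence and maps each token to a learned embedding vector; the vocabulary and embedding size are made up for illustration.

import torch
import torch.nn as nn

# A toy vocabulary and sentence; both are invented for this example.
vocab = {"deep": 0, "learning": 1, "with": 2, "pytorch": 3}
sentence = "deep learning with pytorch"
token_ids = torch.tensor([vocab[w] for w in sentence.split()])

# Map each token id to a 5-dimensional embedding vector.
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=5)
vectors = embedding(token_ids)          # shape: (4, 5)

print(vectors.shape)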

Backmatter
