
Commit

Merge pull request #1077 from pritesh2000/gram-1/cheatsheet
extras/pytorch_cheatsheet.ipynb
mrdbourke authored Sep 11, 2024
2 parents 8bc74bd + cab062b commit bd1ee99
Showing 1 changed file with 5 additions and 5 deletions.
@@ -131,7 +131,7 @@
"outputs": [],
"source": [
"# Create a random tensor\n",
-"random_tensor = torch.rand(size=(3, 4)) # this will create a tensor of size 3x4 but you can manipulate the shape how you want"
+"random_tensor = torch.rand(size=(3, 4)) # this will create a tensor of size 3x4 but you can manipulate the shape however you want"
]
},
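The changed line above concerns `torch.rand`. As a minimal sketch of what that notebook cell demonstrates (the shape tuple here is illustrative; any dimensions can be passed):

```python
import torch

# Create a random tensor of shape 3x4; change the size tuple to get any shape
random_tensor = torch.rand(size=(3, 4))
print(random_tensor.shape)  # torch.Size([3, 4])
```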
{
@@ -270,7 +270,7 @@
"It is advised to perform training on the fastest piece of hardware you have available, which will generally be: NVIDIA GPU (`\"cuda\"`) > MPS device (`\"mps\"`) > CPU (`\"cpu\"`).\n",
"\n",
"* For more on seeing how to get [PyTorch to run on NVIDIA GPU (with CUDA)](https://pytorch.org/docs/stable/cuda.html), see [00. PyTorch Fundamentals section 2: getting PyTorch to run on the GPU](https://www.learnpytorch.io/00_pytorch_fundamentals/#2-getting-pytorch-to-run-on-the-gpu).\n",
-"* For more on running PyTorch using an MPS backend (running PyTorch on Mac GPUs) [see the PyTorch documentaiton](https://pytorch.org/docs/stable/notes/mps.html). \n",
+"* For more on running PyTorch using an MPS backend (running PyTorch on Mac GPUs) [see the PyTorch documentation](https://pytorch.org/docs/stable/notes/mps.html). \n",
"\n",
"> **Note:** It is advised to setup device-agnostic code at the start of your workflow."
]
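The device-ordering advice above (`"cuda"` > `"mps"` > `"cpu"`) is usually expressed as a small device-agnostic setup block; a sketch assuming PyTorch 1.12+ (where `torch.backends.mps` is available):

```python
import torch

# Device-agnostic setup: prefer NVIDIA GPU, then Apple silicon GPU, else CPU
if torch.cuda.is_available():
    device = "cuda"
elif torch.backends.mps.is_available():
    device = "mps"
else:
    device = "cpu"

# All subsequent tensors/models can be moved with .to(device)
tensor = torch.rand(3, 4).to(device)
print(tensor.device)
```

Setting `device` once at the top of a workflow means the same code runs unchanged on any of the three backends.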
@@ -437,7 +437,7 @@
"\n",
"How these layers stack together will depend on the problem you're working on.\n",
"\n",
-"One of the most active areas of research in machine learnin is how to stack neural network layers together (and the best answer to this is constantly changing). \n",
+"One of the most active areas of research in machine learning is how to stack neural network layers together (and the best answer to this is constantly changing). \n",
"\n",
"The vast majority of neural network components in PyTorch are contained within the [`torch.nn` package](https://pytorch.org/docs/stable/nn.html) (`nn` is short for neural networks)."
]
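The cell above describes stacking layers from `torch.nn`. A minimal sketch of one such stack (the layer sizes here are arbitrary, chosen only for illustration):

```python
import torch
from torch import nn

# Stack layers sequentially: linear -> non-linearity -> linear
model = nn.Sequential(
    nn.Linear(in_features=4, out_features=8),
    nn.ReLU(),
    nn.Linear(in_features=8, out_features=1),
)

# A batch of 3 samples with 4 features each produces 3 single-value outputs
output = model(torch.rand(3, 4))
print(output.shape)  # torch.Size([3, 1])
```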
@@ -730,7 +730,7 @@
"And some of the most common are:\n",
"* [`nn.L1Loss`](https://pytorch.org/docs/stable/generated/torch.nn.L1Loss.html#torch.nn.L1Loss) - also referred to as MAE or [mean absolute error](https://en.wikipedia.org/wiki/Mean_absolute_error) (this loss is often used for regression problems or predicting a number such as the price of houses).\n",
"* [`nn.MSELoss`](https://pytorch.org/docs/stable/generated/torch.nn.MSELoss.html#torch.nn.MSELoss) - also referred to as L2Loss or [mean squared error](https://en.wikipedia.org/wiki/Mean_squared_error) (this loss is often used for regression problems or predicting a number such as the price of houses).\n",
-"* [`nn.BCEWithLogitsLoss`](https://pytorch.org/docs/stable/generated/torch.nn.BCEWithLogitsLoss.html#torch.nn.BCEWithLogitsLoss) - also known as [binary cross entropy](https://en.wikipedia.org/wiki/Cross_entropy) this loss function is often used for binary classification probelms (classifying something as one thing or another).\n",
+"* [`nn.BCEWithLogitsLoss`](https://pytorch.org/docs/stable/generated/torch.nn.BCEWithLogitsLoss.html#torch.nn.BCEWithLogitsLoss) - also known as [binary cross entropy](https://en.wikipedia.org/wiki/Cross_entropy) this loss function is often used for binary classification problems (classifying something as one thing or another).\n",
"* [`nn.CrossEntropyLoss`](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss) - this loss function is often used for multi-class classification problems (classifying something as one thing or another)."
]
},
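The loss functions listed in the cell above follow the same call pattern; a minimal sketch using `nn.L1Loss` (MAE) with made-up prediction/target values:

```python
import torch
from torch import nn

loss_fn = nn.L1Loss()  # mean absolute error, commonly used for regression

preds = torch.tensor([2.5, 0.0, 2.0])
targets = torch.tensor([3.0, -0.5, 2.0])

# MAE = mean(|2.5-3.0|, |0.0-(-0.5)|, |2.0-2.0|) = mean(0.5, 0.5, 0.0)
loss = loss_fn(preds, targets)
print(loss.item())  # ~0.3333
```

Swapping in `nn.MSELoss`, `nn.BCEWithLogitsLoss`, or `nn.CrossEntropyLoss` follows the same `loss_fn(predictions, targets)` pattern, with targets shaped as each loss expects.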
@@ -1118,7 +1118,7 @@
"* [Zero to Mastery Learn PyTorch course](https://dbourke.link/ZTMPyTorch) - a comprehensive yet beginner-friendly deep dive into using PyTorch for deep learning all the way from the fundamentals to deploying a model to the real-world so other people can use it.\n",
"* [PyTorch performance tuning guide](https://pytorch.org/tutorials/recipes/recipes/tuning_guide.html) - a resource from the PyTorch team on how to tune performance of PyTorch models.\n",
"* [PyTorch Extra Resources](https://www.learnpytorch.io/pytorch_extra_resources/) - a curated list of helpful resources to extend PyTorch and learn more about the engineering side of things around deep learning.\n",
-"* [Effective PyTorch by vahidk](https://github.com/vahidk/EffectivePyTorch) - a GitHub repo with a fantastic overview of some of the main functionality in PyTorch in a straight-foward manner."
+"* [Effective PyTorch by vahidk](https://github.com/vahidk/EffectivePyTorch) - a GitHub repo with a fantastic overview of some of the main functionality in PyTorch in a straight-forward manner."
]
}
],

