
Commit

Merge pull request #1074 from pritesh2000/gram-1/00
00_pytorch_fundamentals.ipynb
mrdbourke authored Sep 11, 2024
2 parents c2ae892 + 20a51fd commit 8974543
Showing 1 changed file with 12 additions and 12 deletions.
24 changes: 12 additions & 12 deletions 00_pytorch_fundamentals.ipynb
@@ -30,11 +30,11 @@
"\n",
"## Who uses PyTorch?\n",
"\n",
-"Many of the worlds largest technology companies such as [Meta (Facebook)](https://ai.facebook.com/blog/pytorch-builds-the-future-of-ai-and-machine-learning-at-facebook/), Tesla and Microsoft as well as artificial intelligence research companies such as [OpenAI use PyTorch](https://openai.com/blog/openai-pytorch/) to power research and bring machine learning to their products.\n",
+"Many of the world's largest technology companies such as [Meta (Facebook)](https://ai.facebook.com/blog/pytorch-builds-the-future-of-ai-and-machine-learning-at-facebook/), Tesla and Microsoft as well as artificial intelligence research companies such as [OpenAI use PyTorch](https://openai.com/blog/openai-pytorch/) to power research and bring machine learning to their products.\n",
"\n",
"![pytorch being used across industry and research](https://raw.githubusercontent.com/mrdbourke/pytorch-deep-learning/main/images/00-pytorch-being-used-across-research-and-industry.png)\n",
"\n",
-"For example, Andrej Karpathy (head of AI at Tesla) has given several talks ([PyTorch DevCon 2019](https://youtu.be/oBklltKXtDE), [Tesla AI Day 2021](https://youtu.be/j0z4FweCy4M?t=2904)) about how Tesla use PyTorch to power their self-driving computer vision models.\n",
+"For example, Andrej Karpathy (head of AI at Tesla) has given several talks ([PyTorch DevCon 2019](https://youtu.be/oBklltKXtDE), [Tesla AI Day 2021](https://youtu.be/j0z4FweCy4M?t=2904)) about how Tesla uses PyTorch to power their self-driving computer vision models.\n",
"\n",
"PyTorch is also used in other industries such as agriculture to [power computer vision on tractors](https://medium.com/pytorch/ai-for-ag-production-machine-learning-for-agriculture-e8cfdb9849a1).\n",
"\n",
@@ -66,7 +66,7 @@
"| **Creating tensors** | Tensors can represent almost any kind of data (images, words, tables of numbers). |\n",
"| **Getting information from tensors** | If you can put information into a tensor, you'll want to get it out too. |\n",
"| **Manipulating tensors** | Machine learning algorithms (like neural networks) involve manipulating tensors in many different ways such as adding, multiplying, combining. | \n",
-"| **Dealing with tensor shapes** | One of the most common issues in machine learning is dealing with shape mismatches (trying to mixed wrong shaped tensors with other tensors). |\n",
+"| **Dealing with tensor shapes** | One of the most common issues in machine learning is dealing with shape mismatches (trying to mix wrong shaped tensors with other tensors). |\n",
"| **Indexing on tensors** | If you've indexed on a Python list or NumPy array, it's very similar with tensors, except they can have far more dimensions. |\n",
"| **Mixing PyTorch tensors and NumPy** | PyTorch plays with tensors ([`torch.Tensor`](https://pytorch.org/docs/stable/tensors.html)), NumPy likes arrays ([`np.ndarray`](https://numpy.org/doc/stable/reference/generated/numpy.ndarray.html)) sometimes you'll want to mix and match these. | \n",
"| **Reproducibility** | Machine learning is very experimental and since it uses a lot of *randomness* to work, sometimes you'll want that *randomness* to not be so random. |\n",
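The topics in the table above can be sketched in a few lines (a minimal illustrative example, assuming PyTorch is installed; the variable name `demo` is ours, not the notebook's):

```python
import torch

# Creating a tensor (here from a nested Python list)
demo = torch.tensor([[1, 2], [3, 4]])

# Getting information back out of it
print(demo.shape)  # torch.Size([2, 2])
print(demo.dtype)  # torch.int64

# Manipulating it (element-wise addition)
print(demo + 10)
```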
@@ -501,7 +501,7 @@
"id": "LhXXgq-dTGe3"
},
"source": [
-"`MATRIX` has two dimensions (did you count the number of square brakcets on the outside of one side?).\n",
+"`MATRIX` has two dimensions (did you count the number of square brackets on the outside of one side?).\n",
"\n",
"What `shape` do you think it will have?"
]
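One way to check is to ask the tensor itself (the values here are illustrative; what matters is the two levels of square brackets):

```python
import torch

MATRIX = torch.tensor([[7, 8],
                       [9, 10]])
print(MATRIX.ndim)   # 2 (two square brackets on the outside of one side)
print(MATRIX.shape)  # torch.Size([2, 2])
```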
@@ -697,7 +697,7 @@
"\n",
"And machine learning models such as neural networks manipulate and seek patterns within tensors.\n",
"\n",
-"But when building machine learning models with PyTorch, it's rare you'll create tensors by hand (like what we've being doing).\n",
+"But when building machine learning models with PyTorch, it's rare you'll create tensors by hand (like what we've been doing).\n",
"\n",
"Instead, a machine learning model often starts out with large random tensors of numbers and adjusts these random numbers as it works through data to better represent it.\n",
"\n",
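A minimal sketch of that idea, assuming PyTorch is installed (the shape `(3, 4)` is arbitrary):

```python
import torch

# Models typically start from random numbers, then adjust them to fit the data
random_tensor = torch.rand(size=(3, 4))  # uniform values in [0, 1)
print(random_tensor.shape)  # torch.Size([3, 4])
print(random_tensor.ndim)   # 2
```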
@@ -984,7 +984,7 @@
"\n",
"Some are specific for CPU and some are better for GPU.\n",
"\n",
-"Getting to know which is which can take some time.\n",
+"Getting to know which one to use can take some time.\n",
"\n",
"Generally if you see `torch.cuda` anywhere, the tensor is being used for GPU (since Nvidia GPUs use a computing toolkit called CUDA).\n",
"\n",
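A common device-agnostic pattern that follows from this (a sketch; the variable name `device` is illustrative):

```python
import torch

# Use CUDA (an Nvidia GPU) when it's available, otherwise fall back to CPU
device = "cuda" if torch.cuda.is_available() else "cpu"
tensor = torch.tensor([1, 2, 3], device=device)
print(tensor.device)  # e.g. cpu, or cuda:0 on a GPU machine
```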
@@ -1901,7 +1901,7 @@
"id": "bXKozI4T0hFi"
},
"source": [
-"Without the transpose, the rules of matrix mulitplication aren't fulfilled and we get an error like above.\n",
+"Without the transpose, the rules of matrix multiplication aren't fulfilled and we get an error like above.\n",
"\n",
"How about a visual? \n",
"\n",
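The fix can be sketched as follows (the tensor values are illustrative; the point is the shapes):

```python
import torch

tensor_A = torch.tensor([[1, 2],
                         [3, 4],
                         [5, 6]])
tensor_B = torch.tensor([[7, 10],
                         [8, 11],
                         [9, 12]])

# (3, 2) @ (3, 2) fails: the inner dimensions (2 and 3) don't match.
# Transposing tensor_B gives (3, 2) @ (2, 3), which satisfies the rules:
output = torch.matmul(tensor_A, tensor_B.T)
print(output.shape)  # torch.Size([3, 3])
```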
@@ -1988,7 +1988,7 @@
"id": "zIGrP5j1pN7j"
},
"source": [
-"> **Question:** What happens if you change `in_features` from 2 to 3 above? Does it error? How could you change the shape of the input (`x`) to accomodate to the error? Hint: what did we have to do to `tensor_B` above?\n",
+"> **Question:** What happens if you change `in_features` from 2 to 3 above? Does it error? How could you change the shape of the input (`x`) to accommodate the error? Hint: what did we have to do to `tensor_B` above?\n",
]
},
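One possible answer to the question, sketched under the assumption that the input `x` has shape `(3, 2)` as in the cells above:

```python
import torch
from torch import nn

torch.manual_seed(42)
linear = nn.Linear(in_features=3,  # changed from 2 to 3
                   out_features=6)
x = torch.tensor([[1., 2.],
                  [3., 4.],
                  [5., 6.]])

# linear(x) now errors: x's inner dimension is 2, but the layer expects 3.
# As with tensor_B earlier, transposing fixes the inner dimension:
output = linear(x.T)  # x.T has shape (2, 3)
print(output.shape)   # torch.Size([2, 6])
```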
{
@@ -2188,7 +2188,7 @@
"\n",
"You can change the datatypes of tensors using [`torch.Tensor.type(dtype=None)`](https://pytorch.org/docs/stable/generated/torch.Tensor.type.html) where the `dtype` parameter is the datatype you'd like to use.\n",
"\n",
-"First we'll create a tensor and check it's datatype (the default is `torch.float32`)."
+"First we'll create a tensor and check its datatype (the default is `torch.float32`)."
]
},
{
Expand Down Expand Up @@ -2289,7 +2289,7 @@
}
],
"source": [
-"# Create a int8 tensor\n",
+"# Create an int8 tensor\n",
"tensor_int8 = tensor.type(torch.int8)\n",
"tensor_int8"
]
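A self-contained sketch of the conversion, since the cell that creates `tensor` is collapsed in this diff (the values are illustrative; `Tensor.type()` is the documented API):

```python
import torch

tensor = torch.tensor([1.0, 2.0, 3.0])
print(tensor.dtype)  # torch.float32 (the default)

tensor_float16 = tensor.type(torch.float16)
print(tensor_float16.dtype)  # torch.float16

tensor_int8 = tensor.type(torch.int8)
print(tensor_int8.dtype)  # torch.int8
```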
@@ -3139,7 +3139,7 @@
"source": [
"Just as you might've expected, the tensors come out with different values.\n",
"\n",
-"But what if you wanted to created two random tensors with the *same* values.\n",
+"But what if you wanted to create two random tensors with the *same* values?\n",
"\n",
"As in, the tensors would still contain random values but they would be of the same flavour.\n",
"\n",
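The idea can be sketched with a random seed (the seed value 42 is arbitrary; the variable names are illustrative):

```python
import torch

RANDOM_SEED = 42

torch.manual_seed(RANDOM_SEED)
random_tensor_C = torch.rand(3, 4)

torch.manual_seed(RANDOM_SEED)  # reset the seed before the second call
random_tensor_D = torch.rand(3, 4)

print(random_tensor_C == random_tensor_D)  # every element is True
```

Note that `torch.manual_seed()` has to be reset before each call to `torch.rand()`; calling it once and drawing twice gives two different tensors.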
@@ -3220,7 +3220,7 @@
"It looks like setting the seed worked. \n",
"\n",
"> **Resource:** What we've just covered only scratches the surface of reproducibility in PyTorch. For more on reproducibility in general and random seeds, I'd check out:\n",
-"> * [The PyTorch reproducibility documentation](https://pytorch.org/docs/stable/notes/randomness.html) (a good exericse would be to read through this for 10-minutes and even if you don't understand it now, being aware of it is important).\n",
+"> * [The PyTorch reproducibility documentation](https://pytorch.org/docs/stable/notes/randomness.html) (a good exercise would be to read through this for 10 minutes and even if you don't understand it now, being aware of it is important).\n",
"> * [The Wikipedia random seed page](https://en.wikipedia.org/wiki/Random_seed) (this'll give a good overview of random seeds and pseudorandomness in general)."
]
},
