Data Manipulation

With PyTorch selected:

2.1.5. Saving Memory §3:

Fortunately, performing in-place operations in MXNet is easy.

Is it intentional to discuss MXNet even though PyTorch is selected for the code examples?

It is understandable. :sweat_smile:
The original examples were built with MXNet. :joy: (created by mli)

Allow me to point out a small error in Section 2.1.2:
“For stylistic convenience, we can write x.sum() as np.sum(x).” should not appear in the PyTorch version, because it is not possible to run np.sum(x) when x is a PyTorch tensor.

x = torch.arange(12)
np.sum(x)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-30-1393831a87e1> in <module>
      1 x = torch.arange(12)
----> 2 np.sum(x)
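
For reference, the PyTorch-native equivalents work fine (a minimal check; torch.sum and Tensor.sum are the standard calls here):

import torch

x = torch.arange(12)
x.sum()        ## tensor(66)
torch.sum(x)   ## tensor(66), the same reduction as a free function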

Thanks @hehao98 for pointing that out. We have already fixed that line in this commit and it will be updated with our next release.

Is it only possible to use broadcasting when one of the two arrays has a dimension of size one?

Hi @jairo.venegas, they don't all need to be one: along each axis, the two sizes just have to match, or one of them must be 1. :wink: The example below may give you a better idea!
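
A small sketch (my own example, not from the book): the sizes are compared axis by axis from the right, and each pair must be equal or contain a 1.

import torch

a = torch.ones(2, 3)
b = torch.ones(1, 3)
(a + b).shape      ## torch.Size([2, 3]); the size-1 axis of b is stretched to 2

c = torch.ones(2, 4)
## a + c           ## would raise a RuntimeError: trailing sizes 3 and 4 neither match nor are 1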

In 2.1.4. Indexing and Slicing:

X[0:2, :]
this code is supposed to take the 1st and 2nd rows, so why isn't it written as X[0:1, :]?

Hi @omarkhaled850, I don't fully understand your question. Could you elaborate on that?

Thanks for responding.
In the slicing code, we want to take the 1st row (index = 0) and the 2nd row (index = 1) in the example, but the code goes from index 0 to index 2 (0:2).
Shouldn't it be (0:1)?

Slicing is an indexing syntax that extracts a portion from the tensor. X[m:n] returns the portion of X:

  • Starting with position m
  • Up to but not including n
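
Applied to the example above (assuming X is the (3, 4) tensor from the section, built with torch.arange(12).reshape(3, 4)):

import torch

X = torch.arange(12).reshape(3, 4)
X[0:2, :]
## tensor([[0, 1, 2, 3],
##         [4, 5, 6, 7]])   ## rows at index 0 and 1; index 2 is excluded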

Thanks man, "Up to but not including n" is the key that I was looking for.

Hello guys,
In section 2.1.6 [Conversion to Other Python Objects], what do you mean by the line in bold? Unfortunately I can't get it, and I need help. Thanks.

Converting to a NumPy tensor, or vice versa, is easy. The converted result does not share memory. This minor inconvenience is actually quite important: when you perform operations on the CPU or on GPUs, you do not want to halt computation, waiting to see whether the NumPy package of Python might want to be doing something else with the same chunk of memory.

Does it mean NumPy and PyTorch are completely independent and may have conflicts with each other?

Edit: According to the PyTorch documentation:

Converting a torch Tensor to a NumPy array and vice versa is a breeze. The torch Tensor and NumPy array will share their underlying memory locations, and changing one will change the other.

It says this conversion shares memory, but you said it does not! Am I right?
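
For what it's worth, a quick check with a CPU tensor (a minimal sketch) does show the sharing the docs describe:

import torch

A = torch.arange(3)
B = A.numpy()      ## CPU tensor -> NumPy array, no copy is made
A[0] = 100
B                  ## array([100, 1, 2]); the NumPy array sees the change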

This is probably carried over from the MXNet version, where the corresponding operation does create a copy:

asnumpy(): Returns a numpy.ndarray object with value copied from this array.

@anirudh should be able to confirm/deny this.

Thanks @Aaron and @gphilip for raising this. Most of the book shares common text across frameworks, and we are trying to fix issues like these where the frameworks differ in design. Feel free to raise any other issues on the forum or the GitHub repo if you find something similar in other sections. Really appreciate it!

This will be fixed in the next release.


import torch

e = torch.arange(12).reshape(2, -1, 6)
f = torch.tensor([1, 2, 3, 4]).reshape(-1, 4, 1)
e, f, e.shape, f.shape
(tensor([[[ 0,  1,  2,  3,  4,  5]],

         [[ 6,  7,  8,  9, 10, 11]]]),
 tensor([[[1],
          [2],
          [3],
          [4]]]),
 torch.Size([2, 1, 6]),
 torch.Size([1, 4, 1]))

e + f
tensor([[[ 1,  2,  3,  4,  5,  6],
         [ 2,  3,  4,  5,  6,  7],
         [ 3,  4,  5,  6,  7,  8],
         [ 4,  5,  6,  7,  8,  9]],

        [[ 7,  8,  9, 10, 11, 12],
         [ 8,  9, 10, 11, 12, 13],
         [ 9, 10, 11, 12, 13, 14],
         [10, 11, 12, 13, 14, 15]]])

Hello! I had two questions from this section:

  1. Where does the term “lifted” come from? I understand “lifted” to mean that a function that operates on real numbers (scalars) can be “lifted” to a higher-dimensional or vector operation. I was just curious whether this is a commonly used term in mathematics. :slight_smile:

  2. Is there a rule for knowing what the shape of a broadcasted operation will be? For Exercise #2, I tried a shape of (3, 1, 1) + (1, 2, 1) to get (3, 2, 1). I also tried (3, 1, 1, 1) + (1, 2, 1) and got (3, 1, 2, 1). It gets harder to visualize how broadcasting works beyond 3-D, so I was wondering if someone could explain intuitively why the second broadcast operation has the shape it does.

Thank you very much!

Lifting is commonly used for this operation in functional programming (e.g. in Haskell); it probably has some roots in lambda calculus.


@hojaelee, during broadcasting, the shape matching of the two inputs X and Y happens in reverse order, i.e. starting from the -1 axis. This (i.e. negative indexing) is also the preferred way to index an ndarray or any NumPy-based tensor (whether in PyTorch or TF) instead of using positive indexing. This way you will always know the correct shapes.

Consider this example:

import torch
X = torch.arange(12).reshape((12))      ## X.shape = [12]
Y = torch.arange(12).reshape((1,12))    ## Y.shape = [1,12]
Z = X+Y                                 ## Z.shape = [1,12]

and contrast the above example with the one below:

import torch
X = torch.arange(12).reshape((12))      ## X.shape = [12]
Y = torch.arange(12).reshape((12,1))    ## Y.shape = [12, 1]   <--- NOTE
Z = X+Y                                 ## Z.shape = [12,12]   <--- NOTE

And in both the above examples, a very simple rule is followed during broadcasting:

  1. Start from RIGHT-to-LEFT indices (i.e. negative indexing) instead of the conventional LEFT-to-RIGHT process.
  2. If at any point the shape values mismatch, check:
    (2.1): If either of the two values is 1, inflate that tensor along this axis to the OTHER value.
    (2.2): Else, throw ERROR(“dimension mismatch”).
  3. Else, CONTINUE moving LEFT.

Hope it helps.
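
If it helps, the rule above can be written as a small helper (my own sketch; broadcast_shape is just a name I made up, and PyTorch's built-in torch.broadcast_shapes performs the same check):

def broadcast_shape(x_shape, y_shape):
    ## Walk both shapes RIGHT-to-LEFT, padding the shorter one with 1s.
    out = []
    for i in range(1, max(len(x_shape), len(y_shape)) + 1):
        a = x_shape[-i] if i <= len(x_shape) else 1
        b = y_shape[-i] if i <= len(y_shape) else 1
        if a == b or a == 1 or b == 1:
            out.append(max(a, b))    ## take the larger size; a size-1 axis gets inflated
        else:
            raise ValueError("dimension mismatch")
    return tuple(reversed(out))

broadcast_shape((2, 1, 6), (1, 4, 1))       ## (2, 4, 6)
broadcast_shape((3, 1, 1, 1), (1, 2, 1))    ## (3, 1, 2, 1), as in the exercise above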

[image: NumPy broadcasting diagram]
If anyone has any confusion related to broadcasting, this is how it actually looks in NumPy.
Taken from the Python Data Science Handbook.


I’ve checked this information, but I have obtained a different result:
[screenshot: 2022-07-24 093759]