Snips & Tips
Pytorch Tensor Shapes

import torch
import torch.nn as nn

x = torch.randint(0, 10, (1, 3, 5))  # Random indices in [0, 9]
em = nn.Embedding(100, 2)  # vocab_size 100, embedding_dim 2

output = em(x[0])  # x[0] has shape (3, 5)
print("X :", x)
print("--" * 20)
print("Output shape:", output.shape)
print("Output :", output)

Output

Notice that the first two dimensions of the output match the input's second and third dimensions: `x[0]` has shape (3, 5), and the embedding maps each of its 15 integer indices to a 2-dimensional vector, so the output has shape (3, 5, 2). In general, `nn.Embedding` keeps the shape of the index tensor and appends the embedding dimension as a new trailing axis.

X : tensor([[[9, 2, 7, 5, 8],
         [6, 8, 0, 6, 6],
         [1, 9, 7, 8, 5]]])
----------------------------------------
Output shape: torch.Size([3, 5, 2])
Output : tensor([[[-0.3330, -1.0566],
         [ 0.0844,  0.8795],
         [ 0.3895, -0.0850],
         [ 0.3029, -3.0282],
         [ 0.5957,  0.4328]],

        [[-1.4674, -0.6537],
         [ 0.5957,  0.4328],
         [-1.3697, -2.1440],
         [-1.4674, -0.6537],
         [-1.4674, -0.6537]],

        [[-0.3171,  1.6504],
         [-0.3330, -1.0566],
         [ 0.3895, -0.0850],
         [ 0.5957,  0.4328],
         [ 0.3029, -3.0282]]], grad_fn=<EmbeddingBackward0>)
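The shape rule above generalizes: `nn.Embedding` accepts an integer index tensor of any shape and appends `embedding_dim` as a trailing dimension, so `output.shape == x.shape + (embedding_dim,)`. A minimal sketch (the shapes below are chosen just for illustration):

```python
import torch
import torch.nn as nn

em = nn.Embedding(100, 2)  # vocab_size 100, embedding_dim 2

# The index tensor can be 1-D, 2-D, or 3-D; the embedding dimension
# is always appended as a new trailing axis.
for shape in [(5,), (3, 5), (1, 3, 5), (4, 3, 5)]:
    x = torch.randint(0, 10, shape)
    out = em(x)
    assert out.shape == (*shape, 2)
    print(shape, "->", tuple(out.shape))
```

This is why indexing with `x[0]` above drops the leading batch dimension: the embedding only ever sees the (3, 5) index tensor.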

import torch
import torch.nn as nn

x = torch.randint(0, 10, (2, 3, 5))  # Random indices in [0, 9]
em = nn.Embedding(100, 2)  # vocab_size 100, embedding_dim 2

output = em(x[0])  # only the first batch element, shape (3, 5)
print("X :", x)
print("--" * 20)
print("Output shape:", output.shape)
print("Output :", output)

Output

X : tensor([[[0, 0, 6, 1, 7],
         [0, 1, 4, 7, 8],
         [0, 2, 3, 2, 2]],

        [[1, 5, 9, 5, 1],
         [7, 3, 7, 2, 3],
         [0, 1, 9, 2, 1]]])
----------------------------------------
Output shape: torch.Size([3, 5, 2])
Output : tensor([[[-0.8984,  0.9534],
         [-0.8984,  0.9534],
         [ 1.5472,  0.4761],
         [ 0.1619, -0.3839],
         [ 0.9034,  0.4409]],

        [[-0.8984,  0.9534],
         [ 0.1619, -0.3839],
         [-0.6406,  0.8670],
         [ 0.9034,  0.4409],
         [ 0.7268, -0.2206]],

        [[-0.8984,  0.9534],
         [ 0.1042, -0.5466],
         [-0.9524,  0.1458],
         [ 0.1042, -0.5466],
         [ 0.1042, -0.5466]]], grad_fn=<EmbeddingBackward0>)
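Even though `x` now has a batch of two, the output above is still (3, 5, 2) because only `x[0]` was embedded. A minimal sketch showing that passing the full tensor keeps the leading batch dimension, and that identical indices always look up identical embedding rows (the repeated 0s and 2s in the output above):

```python
import torch
import torch.nn as nn

x = torch.randint(0, 10, (2, 3, 5))
em = nn.Embedding(100, 2)

# No x[0] indexing: the batch dimension is preserved.
out = em(x)
print(out.shape)  # torch.Size([2, 3, 5, 2])

# An embedding is a lookup table, so the same index always
# returns the same row of the weight matrix.
assert torch.equal(em(torch.tensor([3])), em(torch.tensor([3])))
```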