Statistical Machine Learning, Week 3: Tensor, Gradient, and Automatic Differentiation (GradientTape)

  • Published: 3 Oct 2024
  • #tensorflow #gradient #coding
    Tensors provide a unified, efficient data structure for representing and processing different kinds of data (tabular, image, time series) in machine learning, and they integrate naturally with the mathematical operations and optimization algorithms used to train models.
    tensors
    tensor operations
    differentiation: finding the derivative, or rate of change, of a function (see the GradientTape sketch after these notes)
    Tensors are a generalization of vectors and matrices to an arbitrary number of dimensions
    In the context of tensors, a dimension is often called an "axis"
    The number of axes of a tensor is called its "rank"
    Scalars: rank-0 tensors
    Vectors: rank-1 tensors
    Matrices: rank-2 tensors
    Higher-rank tensors are very useful for representing complex data sets; for example, a batch of color images is a rank-4 tensor with axes (samples, height, width, channels). See the TensorFlow sketch below.
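    A minimal sketch of these ranks and of basic tensor operations, assuming TensorFlow (as the #tensorflow tag suggests); the values are illustrative only:

        import tensorflow as tf

        # Scalars are rank-0 tensors, vectors rank-1, matrices rank-2.
        scalar = tf.constant(3.0)                       # rank 0
        vector = tf.constant([1.0, 2.0, 3.0])           # rank 1
        matrix = tf.constant([[1.0, 2.0], [3.0, 4.0]])  # rank 2

        # tf.rank gives the number of axes; .shape gives the size along each axis.
        print(tf.rank(scalar).numpy(), tf.rank(vector).numpy(), tf.rank(matrix).numpy())  # 0 1 2
        print(matrix.shape)                             # (2, 2)

        # Basic tensor operations: element-wise arithmetic and matrix multiplication.
        print(matrix + 1.0)                             # adds 1.0 to every element
        print(tf.matmul(matrix, matrix))                # rank-2 matrix product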
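    And a minimal GradientTape sketch for the differentiation point above; the function f(x) = x^2 and the value x = 3.0 are assumed here for illustration, with analytic derivative 2x = 6:

        import tensorflow as tf

        x = tf.Variable(3.0)              # point at which we differentiate (illustrative value)
        with tf.GradientTape() as tape:
            y = x * x                     # operations on x are recorded on the tape
        dy_dx = tape.gradient(y, x)       # automatic differentiation of y with respect to x
        print(dy_dx.numpy())              # 6.0, i.e. 2 * x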
